[Binary content — not renderable as text]

This file is a tar archive of Zuul CI job output. Recoverable member listing:

  var/home/core/zuul-output/                      (directory)
  var/home/core/zuul-output/logs/                 (directory)
  var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed log)

The kubelet.log payload is gzip-compressed binary data; its contents cannot be recovered from this text rendering. To inspect it, extract the archive and decompress the log (e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`).
ޅO$:Xi{zLN|{`h6FD`q& #k}yy/M"RH/I2&&Lɒ` Z*&SVGU۞,yvmR_Kd^o"`yez1bBzG\jo_ | dSIѦMs ]$\H {D3&MUd;P:pI3' 8IZs.*V7T8 qɔϻz&ekHޗݦXdm+2dRtWxl/Kւ"nMmP ZOB^Z吡:#Oge=?+;?V֋ֲhjE2LrDɐ852 *Niv{ZI@)'~L Rkg 8gQe=؀$ HITL,ORw,0Ѳ[ZΑJ/Ӿ#Ɍ+)gu>cē)I]{[z9!&%H!DRkN0:'WXnI6 *q:efcQ8"9ZSRBШ+bqǖpoҤ@pecȟE#[[^+$0#昴$X/B0PTQfk7(Hd_LIRA)#e#="'0Z@v<"@)K^ɵ%i bElI|wy\qn—X ?w^8yF4fRnMqATbu}1U(};ݚ2q$޵~r Q(oi53jajFvnIqkM}+mF=O&2c U-1} Q+HN[[[%3WI1CNJ"ȯK+2YN,3HJ:tfn+7FZE0P$Z$2x.dtG@d)+9Ŋ\,V[14[ڸ\^!2^{{ RF}C'&q'֧fюx xBxUHy..ГzMi!cWĆI$ 68Ѝ`|^FćNBxluF#OF.R}Hx>_*MN7fhҦ- ﯣҥr New/]C>nY_s cHDgиe"O2Ψ[:s.\"t$L[^")eF_19 $7Za djLlYW[aM7Bxʝm6L5:yi#QQoNT穪R^Het1{όt$ jI*P\r%R49)*DI~򁻞QzUkq(xؾヘo FdO E.B0jn΄홱MhU)M ar`вoòTo$ lRƒH)&OE6`|2@,Y[TƬe,aƘq4v3j96xכðY;j0l'T-ڮs!Ai}«ƚn+bZ>4 ј`e#ƙ4JV+yPƋ'.H}F+0(h-r"<*e/Df#U ?2B.vuQ9Mђ!1%c@A%y4 k-0n|~Jb){I>JB10@\Ĭe[|N,U\LV " G+[3)& P,X29 ̃YE }.(TYRMCrrYHBbNjZK7\6&%ɔ7\S(0eLN4 Pk+Y;v;v;v;`(y%׻1'H $uNhD=wM`OQ"DΊ[QkF)b]6b6n:i Ԃ3UխR[RْU$"Ikb^3o< q-1і$Gj~qۿ˪O4_|޾^u{`Z3~tnp9 pGw}k\xäV_wfp,})/b?Ϗ2 ~Sev6o8' _{"bŷ߅gGvp٢-mh1{-~=89Fr9vϾjG ~l^$eQ] ʛ^9[r1nmXZVgC߽~5Pw>:T~c;llFIGۧfNcUw'QWI)c7ÔN$O,{x7'6`eJ%խNO%WjlyXmijTMJC|O(jur  )D% `]EtJyUA/Tdhg;fu{ WS.1m -9 )p/[6ڪe5p|4Re8nu<$ZE.= qV(%-Os3}mro8[^"}{^Y+GO2 ~cM\'W7NkZmáv+e 纯#g6<9F=[4^2cήq^';byNzƿ7=Zl7uǒBC,krɜH{ä޳q  Kgq"㭗'@4t[O[?)ם񮷈~Aςopϳ[>g:{)!Ԅvҙ4~[N{Sm_%Ꮁ=O>q;[-O'@P1JsƎmnB1B~ '!?]|Ɠ^:~:~3Z^yTzSFŞmbh"W\uߏ 㚪zR&RQF\4D掹\-q8 n^fJe\kr`rh}9rե ЮnA,"7+#Y,9f^ lhq.UxbjWha,ɶ{<0-^0ųN5\wzzCTɲ yHR4LhmFJܘe]nnµ֪תpiz]k-I梁#qZN9 ilՌsfUU̵@RXu*ƪI\>×vF0cÛ1NƣodLTRk!0c5{ã9Yy%nrM¡㫹E ҆m+4JHx&1V 2 CNw`s1 4Jːku&_mP֠F'̕9i ÄΐF@ݽc> aHr.$.hiM[$kYǠQdOIP7'M-Ī!Ryxh*c$y 䬾 zn|vFiѦ`=vV7p@*XʃjMn]9$Eˣ*ֲ"*ޤ8e,f9vX-Ҽ>;mERilE?!oVSY2Y2V(+,S;d68I .OfcTi,s JP W4"Pa. 
W8nvPL&@:9/g !./5GȳBmTM/]I1~hd\AS@d)ĐikE"Y ݄aK$-0&lJ80)8ЙVNecX"6V 2I_~N .;f %ad_\Y{vi5VJx5=+(IH}^R6-,X\=Պ!ѿ*`( ^W؊"PD W}\'W$SoQg}lg +v*6C!DE HbQ<ږ =]ATXd t.1#dnІE8zV"5R&DndjH 5k*JqQڜI<(0Z2)I1OH2˧{H:?;(Ws΋&t Be_j;J* Yk0m@ Y,gfFZV2Zcԋ@+hB?H(Or|d8rO^S:(0q˶ %.I%n"i0?mi3φbUuD|٥P#HhIn#!2L.P,9wQ@-0 U&y]p]슄ީEF8Ø!ՀzЯb-n\.W[dpU}W{^㛁Q7,<-Eަ%%e.6׶_:=薀`mT%2ZlI˺]ڀ sEDY4i7taVk؜m\FYus=>YO[ v[[6 PE]U]<~}ߖGev>xV7se4KZ1g#;|p}zD_.m7JQ4 @R bcqD9E̽,!Yce3l쎕 "}%όQ.q],(jBdԐB֙*#+qW<87q;FwFF1X,7Q[c_+m$I60%}0`c` ^,;A)â[ wDIHJT tJfeeEdF6Q.;h!* 2f=cʙτpxxl[t6F/TƲ++gV?;WO`vu߸6 #ty[Cc{t7?+Rj)݁qSxb0w\QK'U3W_yYE2 &8˻xDɯCtxv+<:o17zOm_6 1SHg {!:U`eSP"ﰠ+réM)t1 ZiW5Px+!v>^jﻵЖM܈|Yņ+Ma7&iZ@73LQBX F&geVe%gm*lW9Xwdg\o6/R'+CDr\(a~JVAAy3DIbj|C9BxsRڑ(xIeCJT+-g?8Azhmև >Dm}}6Xa)gAP>L;~NBJfv xMld8,Wr@H;-¼sxwO^=NUbf37387qX7͙K_U+eiTɓJc]a.Sī.Oz]txUzAP \$d XjfM0J PpLYa ά$t&/MGuu*o6!YB-z5b bJ!VHKca^:PmV?e%Ts_ g yr LyNߏIt. {ӻðQzsrc+!-9͟)_ݥ x_qP+Qz+k[ͳ},:P_-oK_@xzM?kҰ7qFUmF w7{۫G߇l&-ȳU=34x{~wl{_bQgy &eh[ D=E{l;u.e ab.X\IK\^WiΟbFyPY0ljLP. F܂ɧ$ K;l\Gl3UGt#^~DL"1Dl7eg#1 8/D%mI$8" XٜLJėZ#t*@XA/Z V^'xA;9}ߨʖT/;%3Ҧmf (_I*6Ir>騣tsqnfϮyV"6 =XVM-ϜʬHZ(r)ҁՐ%:hb9ج!.B˗RS^K%;&p9ڑ'"ڜarX$ |(?3?`TG3jRF/sPY;V-O5J*tI_#+2ŘCGj8$eߥD配(2K9L5EGW֓moBkd[|^OCqsPX*ޏQJjɒhDQp9ȣw>FPC,e"Ƭq$vV#k{ SEח|H(@J2r؃+I. 
v كi`$i> jf®=ʱUa4~(F!r-3:AjT }}"R5C[-Hbg5 7?ͳvKDhl (re1DV \啐€Lu]%3W1bE)XE@PaRDJ;3I(ɉXZ̕|Npvϊ} O8eGA~MvYg(7m36ܫ;tttg!GsnH4mATxP$٨%HQ_eX%8RT2d>F挖0G}7?t ]Bބ :20J,EvLj z4H 8՘͉+#9iUThaU"W"[y-@,|d:LJ`<'w_#ZZ!:2Dpo x:g*EKZy*S9}jIk93C׏ٿWܲL^]7߶UwޙWԔv0׳G JH+lIRQ%;Q]};U¾(Fyvq'ҙ)Vrͣp-eeG+Sr))!jS^PcpMD,%u;˝^aFRsb9L6_R\~lqDy n=6o)&\֦iIJyHkD\zz46sƻEܹۡz&;BY CF~+}uѥpm•к$/8`f4M{X htȷs逖g`OML\.751qB_zML\"]MOXi,Qܕ86\uJDWE*"4EE#Ƀ9OXdXd\ N8h2 W`t{ᭉwaga &ӻ{5`Ч Jv|8&Q)ML Eʜh{YZ'.Im;Iw\E͂\]zO&|Sޝx szĽ'o}tP?Ɠ9mי'^}C@=% Dex$VV\%(sPj5z .T$c"2M8$qB.6鬖ʹ)QŶ' =r^t{I=^12vxa"+n'XQ!qTkdPS@J+g}>ڻ寮ĮmYv,dYrN)nY.yl_YWW T5AUjRyUNv鬬їgeZY^uZMRLGS- ?"efI&aHVqJchQOr2G\&|(c$r`&ԝ hI; Z΁R^^\vTG-emFJg9!)C ^&T3ko%B3_`@SH@td T1|* jZ|$o3W10Z6L7M Ndc)9@ X"R~l 6 ӬTˬ?>kJ8]Z4LMw$c9&%VvB0tV F Adk(`7f}Q{PRsѳRA) #ecG@^IcKSXA1XP\2mdZ"1+kY aR:BX{xbHz?#F7NX-ҫ WuvN' oÿH{CRi>bSp1o3jnrFę[i0D&z_nӴ77L9MH޿乿ͻ.~Y:ϰ7J~ aS }$q]4fSW[27ux*xPGHdm6acاN++ܛIrARx^TU`h^r;ttgm ƹ|/N֥~-KO/Ys'2uF.$YƳBMQY:zC|EHMh bB!l8,)cgz9_!4uYyD^ 0<4yDJrӤZ7x:x*%j@-2/!nJZ \1Qj(ϱiU.DɱM.d&V^oZsOjHUGAqѕ_u,K]9O. 
(%ǪQ[S!0<)K!$z0hUɕ "yL6g&sgR%c5rvKƊVJZJNjv]u]=gOoqQl͟/hp8^Ng@@ DaqgH3HM4$#\2f]3c(Ni{#j ,+m| Vq T'r3DC`JȢ(lԤY$1RVE:x_aB~=z#9E D.@ Z@6F2^ (BD#GK"1`ߊj;vxfVgwVAFoZc2Ԙ_NI&~@Bރ#BP# $4DtC+tC+tC+ 'w_1ŭUJ_0`JG`A,34*.يJAܶYmC(Dְ^vZ^}{4)qB`NZkTkuZP)X_5=yǔWD0).+jw*⊨U .Jl/>҆+8ZX"Xs{2⪐+B-Bka?2%NhNH\5?qU%2OD\j]ArWq[Y J?,:~kt+bqL}R`g?|;ggxOsa <>}uc_I{&Ӆ\OELjt]L[Chʏ(eUx:⪐TUVt .TJ׋w#^Ǭ4G Lar0jzqu1qWWG?zg%{tN|Uߎӟ?-sSYMjkίcɺp4| MPdrzrΏ~青}Vv%B=P0ĹBC`lfDϸdɵvelc xǃ˾$]ۏՉZ&<=EÔٳns |N?^ҳo3x~Lr;{ E e6^6A h2 J}QE4;*/zV~;geؕ4\*l& ; SFmIL&b2p!kOkmF&sm9mD= ݥ"gz/vtk[^W._uɯUћodbj㬪nbSA|j5,QZ77H-͆˙]gTvbsf*]V)sOmp9ɰc)&Iw<8Os Fd"!/& \Bdv$;5#}Wtڢ#ǹ xm$p6u0;0@O- r“$5$;eV z9O67\i!4 29HzȊ9/b0ںːђ JUԊa'~ ݞhNR9fBV,j%X Rb %yZ ^pTS]_uek58jHg~&n x5QtB4 $S!%RƂenqqzzՋmlJF=iWWEX݁ly> p;Cohps9\Z\.dէ>o.[N}.uPǕYp掠G!Y2y _ %l)BV'7 6ܟ/pK_SRA_4`F~CB=a}74V g 22ߌ6x{jV;l7xum?cĈpRcWGΟێZ5E{÷q [Z#]~?`NEKlrcA(Js5g9A~~og,p@:DHKc`۔93NI#SșgDžsfDU`CL )P,%g2H;%yG5rvw\6$'WMRȬܮNv6GKBJOOmC{*x!v:Ɓ;=P;^h_=7VЛS}Jڴ NnQ+_ []wD\l nhu{eu3_}{MWkvgOm톴͂[v}ݛ/{켶r;?L--~Dwwx;m?cakg]^k./tS MfU39HþK+t XH⩯{noFyyu|i{/䩋{}>)p*&D@a3bk|l&o[=z|hQJg2HQHa0--z%/{ v'[DIk602`܆>d3<V"ȬO6 <ÎtX#}+rΒ}t'Wf՗32iLjsҗJ7!_ve~Ye=nzt7/h!@3g#=)/vOBVnD_:{\.[GWgo  PG|_1m`}W ST.R˰DVҶ5d\bm$9뒔{%qy1W\ZL17ߥ~xȦir4iͦ˪N'?J>RA2ixA2jaMl<9{Ws5 9NCceb $o]nP .e\'ɋɋNdu JZT Jm>P2.PlɦsFis6c3IWm_,98K0Kfh<ØW8_1Sg}^MaP9bu<@)#!: DQC )*9?bS**8yQTֻ@#ػ9p\*׫N_}"}yf,ȳTNEKvJcԤڝmgXHӯN{`Ij]whߙRY$Δ oNBK@`b)U6 {C.K("0 q@.)$ @ɱ"gOo%M4ܐ} ގaDbN)F bZfR1-'7glsɑcDbhuH|&dj<+[n.K01J dc9xց"m87Dj  NJ P< E{#,؞qq{s0Za42-#t ]Nh"* 73ӳv{n-մb,'MXR& xp0_H(c "jDɨUTH$#;{zD^JeuDȠTCƑElr/qD(=K?" pʴ3R '}y˴q.GqoSfzy>*AIah[ݶ۫x>uPo?P(mx29ovGioB߮\P0X-W..M2cui0gqr\σ6>P.\;ijp}9L0zTš%r'H{𧼈EEM|?=sgS$MRS=CR#)),ϳWZe7^$]+&nXI djӘ E&Iqz"`^rts)Õ}']uBOsN[lB\:$UXN.Z`"Ô>$=)x?_ƃ'jM'q 247 |]Fp岖K^W`6aWf,kGz5]LgKsU_|yUOi¶oqz/+Bsxh>s4Gƒ. 8l#kR"6̑P=A icE3SV1FGRG$F/Q 0S)llHGC%oYGJ'JI=vA|AL%D`p)&{;Y_rons;sc̱r2r$%-jH)EԒ)E>iLy. ޢ&' mzdCXi9@NCw-eyj1߮{R6)|/*| bץ D{w$=. 
p%c1uH` ZGK!`2ft*;}THbaJ[pGp%=q3~}~߇w}v,XAf:_9,tښF;|k2)u݃?50|rt0Tq;@8%JJQ@S6[x;}@Wq5*bDڿAZ8>uipFy Zo]W>{~q쾊LG TX-zg՘IDk1`yl5ͭvfnI? hXbʤV'{&#U6V:'1JoZELɼ5eCRaN>G|0arroDVEz *I)mQ9v,dۢ4sm?aOFߪOjX\ON*wW>rm;IUAWg$Vޠbc؅A: ֞ˢQE=0N_{R+M9P:LT:.a=ƙ|!$% L\*aUiTgC{%;(Wb`SX)\s !KRU7ɈV Q sVA-m=3CJf4fd#g(^>nw[Fdp]hgf^9~>:kz M&}YA/z%׻n2͈O1+gj'0jJ-ӼL%Qγ y=ZAjH`F) s1H= 幎 $R ? h!,LVP0YHMLc1>"'pa y‡^};\s|kcnMȂ!0H)JZ4xԖh{rL(O@N#!r1`G Nbam b 68FutAq$\@KeV@K`UDj͔pADM7{,ϑ%7iQh 5(O+-mSw7Xcww# 1ǵ)j΍FS먧P;GI["z4(PʍKyEm*b"(@#8X"%JKF94E-tLyx;!;r 9Lvqc'-[W\=~B-G`ɴI*t*-GZԱITնy-GW wu+H3oKA#.SJR aeΠFltGi$VDz1PFfqPքR`bjK4*5AR鄡]skkf&8?ߤg шu3QziHk>z׎Z-@,|, Dc\kD^Ik ɋ\*IyJ)V;7}Ѣ&ԍILnwv~xg6sVz{vj&=xs V>ahF n%7`mn.R[f2E\"i޹z#ɡ瑜M#&C`w,9ff,3YB5 ueTUUoQW + X0|2*٩DǮJhRNHSl~@x0rm//W *aw9ti{~E `]0CzT`0_I75/p~8S1~p߽yr]VrݮlB"竼wһ2]ʨl7 ;n"''Oӱhd,~Mac ^v> aCD.G=Zr ?èԕ|v1R/ 07]W41ӟlwC](HqFK.2Q볢Q:{A[Z"dQDE1V2/D+m4 &RjڝJ|vkO+?$ R|Ʀ31 tfZk^wa^6wx(* m}@0#dD.C%j:/QD=ǭfp[kzxm={7BrNmܶSn)Kk;嶝rNmܶg)vm;嶲vm;>awJ@W $w\DQe  l%T16I09v`dv W%)k4yݤHMRTS6`.{AxAޠ#혷{]4)}ܗ~yz' ^+vVvM37uz4fJ/6nj77T5eO&zMnXꯚzHHO}^pp7w ku!LPM8 :-/xk"&DHFE+ E4,liM|ܬgP6O/PΫfqvrj ܜzQwU0Q#ĜV?\JAW/ïp/?6dDQ$"{ #1l>tJ/k]`kIѮ@M;uř1 U[L\fL0 %^e"uU&^D.hڏljbKI\&R"’ Rr e%Qn2U($1$5RovS & TDy!*hA.̓E&@M ^e~~/sz0b T{)pJJvbX[251j)g?RU6zd$Q ^H 1}vtHVbRu.yqnk[a8K "iB'"%AKJS^"VZ.U:v5#q@XYB=S$Em%\Jd{R8qg-gO>+wf%*>|Mk|Ny-Zw^qL-J)-s(I#'"0MP9u,& |{/_ =1T*Fhjƒt ]B΄5:20Ա7٠J;Hu"He1h Et49*͍RH0 _u *'ZhTBb /E%y TSz/y(5 I! 
106)Q)FXBSpV;/R;@\ޭf8sp["'=7,3/V ҫm;wlj ηw}YSuA/R E@*jpGj"!cJg D[#(Zd۫E8OLLh+#K2ƒڵb zAo"Db:QD%,wVz3Z>Wmm9{d9VMJg`Cװò\RoPv.j4$&LGpGm`*Qfs3DbK>`wD;VBѷ.* na%ByuU&Ro:/>_m^fIu|Y:='u{ lm8,(]2keoVӷw3;tfYwW'f)^)CO,:ذ#?Ɠ6zTGC6h=b&ᘸ\2Ȥ qOli*-)뼈6IE_uh__nW<ݠiwp>7$7b`|(~PpC>} M.Vh[{4BF3 ~E˵㿾lȫlbJ̐/ʕ)qe[zQ|V*,>_ d e 3^PORЃ+%!ġը汕zYmb[>΃c TAls]~.TB^5U} /j~W&&i%| z?c2&.?Q/9=nXx2ryҫ?}Wvp\;&]so!WAU00HŴ=NGI&_?>V;մlTAϽ.N?Rz(ZV_5gPw2Z~ǞWx(gEv #<!9h>G$G+?$sG,>F/fwA:LϽavkf5z2LwRo.|̓]~,g ;<y\|jzBa7$U/ݠ7 ߮r10['jsߤk9]D|^'ٛ5heG('bL.pS  lQߛkZ(GC SXDؼ\r7Ԥ% Jey"+?e+UE9|-V* UxJߗ]ʞj$+o)u]t>P%cQ;n$gMw39RȌs WerL/KsS {Np1Y9يvv^mFshh >j8MdGD+R9m#1*+Jq{mhv{~?G3DRX$k(z+ѦɽdLG%69e}n|V?A?'i/a%>39X}5=t4`Kt 000zA)6fen؋ooZ-`b x! Qd+Чlvd]ɳ< )B}LLHwR !ѯxFD.tD|'k0qHDapa͌@ Xg\Fg}D DFQN8e%#XZ-g\~]'.Kϳno\ K>VddۈeA*/ b!Gnbܠ.VNɵk] "WjlaS>(gv뛾-Wj'cg > e* קJk3lo׸ 5y;'ZatYMZO6~QABu?WܭZ}Əo("|2W#& ?gL'eǷ"'#>Ij'@ha>}}#%J+b4S+TJT(A8h=7v䜥$C4؁MO>&vvlpλ]dC PCD.EVdWP*ik74o<@(v*؈,9weLRkTrRVPjz7CQ1&d\,BsH/Y U %$JA)ъ[ #.)>iiEDa-(M&"Ƞ R::iCȅUAjOj]t\'){cP_65,8LKT!_!'h QAAh䨣]$Rr,i-zc}֚ 1FɳTy&) J Z"n}hD~琵㯠M׃AcyrrÁ뇲V׷w5?la25-z'g"Ny; @AuOv{cWW4>*Zy⯅~7 $Mhi,C('bԁ3Jv HŞ5ڱ?bLpKi  `)U:jn9K-48{K;o3g=Aoyo.>SB^.Fp>g> AgW74AY ;ȤuV MAπ ^ H7#r`A1nx2l%JDAeJINgA!h|bFҊnf =C sb:W7r7Ά|kO!b b:j0^}䂾J|JScG}֣lzOQU6恄@:M(_BUcUWCEI{J(%"@dɋBFLٔ #5Vj"( 3% E~Ɛ}@!v*J邁d#jEYWI;ۗے{]׭^PN{r紧7U@̩`Q eD 1W—D]rP af1_OG9%r-X c3XXhPVEB(aB2#gwBP9ƨ k{ֆn,g9Pr/ f F}ŨP (чȆڗ`c! 
FTĚԑȚq" AKV^9i"W>@O3*Sʊ:ZS0(18R181ŀjƘt+PԊOwH>E]1ZN6%B^ˀ>[$k,$(*UjG1wC0܌'0<o #C̴5Ψ9Sm$ehҚqk)PHͶ֎n ;n =4J[xz>k"Gxo9@(Azҩ>$K OE F;R)UH:jr) ĜSCj%dH ](%@ M5Yq3rCdYpnvzI.w^|zz={ n{1_]%vQ5T2{.‡_zvfA/>=ȝgK^AwwTK׫~ߘ Kxmyuǫ_Wwջ/9fӝ xe/k{_8[+wlzww;HO|ϳtXSτG\8Qk6}tn]ذKk޵'B ޡ!J;T}s}]?onW)̾Nyәw>EOO*8Ձ2@'jok 90v=yjrO7TqzwU8&ϟB\}^ZIWӐռX ӗ匛|5*d-hA90r҇PE*@e:Ko>X @x^o;V펕Xm˱JkaohwK嘰ݰ5CvVե2I[϶8~Ue܌r3H9L7M5𣀼u>sa@s\OB-^?OQf}&Ռrrߣd"`y>|Y#WgQ(Y(CL SKjZjJ&6߆'mLނg1j!i!QrBYci̲=OLS3g.91 +}Ľ'Yƹa'5~]6s$)W=tP9tx*+edUf`5~`o杶PiXW"Z}\/w1ˠxp:m8ir9iIXG ):HITr\HmG#L񜙜bao~HƑ 'U㆓'N6MUN)$STs#zuo ܙ.\睻6j:7Q)e~|WEyoxvUn IBu$4y,]J*B`ZZ+4j@]= ztԵhM$Fd-b.(:cJ(r1mꊟ,[伹F`H=~;:vyL=>ܬ*- ҷw@I*k JfK`,LJR49FƤ7Ҕ(E$IY&',!KV!Q!R&ZM,T%3\__Yo潋Į&~;S k㰹g*=[F~;dlw+a-[\#Oe=8\RQUV6a$vCda+=e.=)/u~9CV|,EY9gtr]q^A ұDdbզJ? bĽU Ō9O`wC_@%8krtYzґ%e +V#)km#Gvi@pwٛ nv0w|:8W[~,+ Q)6XXU,ލ}/5md4E4ph_ń?s#=1/c^Jn1)JB*vsC;Y|Es&&6JGqhdSAA< e0qnLn_,zʘ t?N@|f6q.mGmJM{y1N{IƯ?}fzA7My(z8]ɿR8s[qwQMi\&yeW0h|SnxѤ\GvH^!0oF ?b0kjޣxyzS85"NG׷]ELqQ{}nP{c}'z:7wuTy.Ol~W{?ʮ}I4QW9}4OP ~Mztz 0e!ɞ6Tnrz$S۶񃻺;m}8W~k1QqzBDpA3nÉoQzDO;e|m.#ó,7*{|5E=?_+ӫ6R)~⣱SMgj{c]k:ep_cQ; l&Ϛ|DѼC%*G56y(}*D\w6,o#˂?¼*yvg_'9zX4ǿq4 !NlA"%0O}mlZ&܆qR'*)}ihvn1FGu ,%#ҳH2!0 r3m &BJ.DbJTsQe26}|Gr0?Zr9[O5NR&%)!p(NYZo$]T ZS{6tkePuv_8>@kҊJ *{ `PMld~Չ=6W';`8b>.w!<o`Lqv}#7EH<4Dojhmeǽv\0N@Cp:iTPŤ9S/pH="+F \1rmA"L%E16^ueq*h^hwdZl5?^.{Ovx+O} mA5$Tj^NM`r{?/*f͈Yo}sv`<>8^n[=瞓&U6F)1ۚ]9C"e`T<j[(r{pͮ]ߺeR)&O̕1<:ՠmN)E@%P] yKy`)4q9dlo a4g$H߂BSQ 5C'Sh_,p6/x1鎐;JE7wULNv W`5xPD93ֱ`c)U1H4-lB%UZ3f͸<.,&BQXBՅ'Q_M?%]͙(]8'oث۫vkl*iL{Cs. 
0$ + 򈡥S.Y2[!<NC+,M.$ {DD02W.6]Âqy4Zt쪵ea-֮JB*xEF \ϣB5"E\71]̉ĺD ݨ-]Ru5pM 85I*ʃBry QFI{4(nϜklM]iuՕvȕWFx?]ϻ;woY{'8xRr-ѳxO '`@1אxFKs%XÓ$vtLPpDLPpLPYI( ZO ?bL3@ƉD%UIa Q8Sԥje&(V;q`Tx㩎F&@G``y*R-6-R{|L1; dTLxUr4 e D_0DrMGsbأpVy6cg )8BD(*E{' g{5Mj"'u믲_,Wg5Nj_GqBX>JE-qɥDYJk)MB]VQ?E-ևbŚ mԓ@Qim,gAK%x^Cϻ C0W9w$b$HPo#`8W ۄ탕!(;+{e'NY0G Jொ/wŗͶtyq~ن:'"80Q " !A(S2ѱ@)R9"qN'"]qb6 y P|y gǨdpĠR`7$pcY'QNY.K¤PEsT| dc'T$_¼*G^p^ג>Y^ϕ928|g?T/?\+z5ً]5ɸ9~_^{4.ݧPGw'L?^8/'>p(zW˯\hwS ;5q-B!xclA.^ZHm~q'@޾vlUgFKo $Ö<ޜمD+9s|^VkX+f_eOQ^ s>2&)~@cxt2QA[E@%P&yKy`)4q9dʄުlL THG cIWy#8\7q5buDpJK%/& nq\LiY ް((6J$*ũpܓa7qs~ Xn1taXa{+Z)@Vv X}ж0*#g嘳8MACn=k6^u~hд״Wa;P.bl'y/`v<8qbVgMN|c1{ID($φj H{cfP=!䕦R[L\mrN~pi͠Ȩʪ*^PwIPZ5,"E.Ej;vj2֕Ԫp<}%9Mmehy|V6[=CWճMC|s0HzW0W4ګ f՗Vh\wzӴU1Sal>Y&.ObF7ِ\L1`_3>W8S@˿ΣpBVYeM8*cF=+}_2*Zex>c*> PU-1*?M5 F%R5FW 1A[CU) б=ұ -l еwvB֡  _{8pRgSߌ-;%db$E%2S\p%ɾ'GE+w|_wVAvt p Vb $cT;+9x)([{w&pڧ}G\{Ko|јx_?+u^ .ss"8 AtP;>^G: ,kqNv"k2Abte \ H=7Rbt0<>&N"%'FKpv2D.bێ\lE я udX67*n~n 33tWamT9]UⷨU&BA)%朂h([P.ʻ@ooAw/n5mRV<Ȼ6{5PUnz|;o ?[NgWHmk%\wbRuT́T)"vR'$zYIޣ|UEIى W-$4"h R@ hkPе#Dd Q.Qn- OanxآXsgScxn(zzx7eDǀ/10 ,JH1'֩OMUAnHlp<{r;STh]ly 4~(zBUidZ/W6>jU?︊J.Vӄ0YNUY4/XU[Q0@2}+c++͓#R`ÀZr@|mνۧvg j{*%]֗k]ו'dQfZc/kv{ڢSU-Pl" rXM5b(]Y!X )NġR}k$8ޞ=;+llP9%zrhծzU(@.+thSB&U =*-uƠ`PVPYb D#s[vֳnlgs |mrz_ (1R-Sp!D^\J%+G'IbICRl'&P*N̐vhV J>rRk d >bɍe ҀZݠV<~𓞱")ij) c&1Yn@((P ׌ELjIi.cƗ_ۜx!H ig?Y 02'ǘ/r(`[-YCut3Pq3*CEDD/<dy{jm< JlPj9zPYHJT]$lΙr斦fu02MdԡyXНC Y[Xs:YO#Ciɐ3-Mn0r^(7NM;+//Tq]珜?M2r钍BV5Yki*ոZUO-~m{wQ_~:=vv5>vm?{g9qX]\z~@B{v AZAAi2(MF4&d4&dP JAi2VAi2tlP JAi2(M0h@Ai2v\ɠ4&dP J}VdAi2(Mɠ4&dP JAi2(M^wPS3(Mɠ4&dP JAi2(M1 ^I84i~-&"{OiҤfPCJvuJAi2GU4&U4;(M JAi2(Mɠ4&dP ?iP JAi2(MжAi2(Mz JAi2(MȍېlA9=(MFm4&dP JAi2V1%qM I'֙jL`U08r&9F 6@\r~v,V72`ՀUª;lo' GYJz!He3fgJL섗pJtUO6EG5|G0U+ڈ.5 ‰@,g(q`7qbV[z 8+0r.QyTE%,PXtV"`k8OײOt3w6DOK+^#f;?:/HoVQ+z/r_Nկ.cj-T8W(<(k^v"Fgxb*h0ItFƝVtfR.R lUƘtH}sb"E̅@[*9Q]֦R0T-RJ mBGU7uN&ΖrWyǶ>;=n)^iΗwa֓9;-ښoq>7k \.Ԅ}0m~5\,\z o5eܶBnmv?e$#lWlg7;H7͜iDԛ6TW}%ӻw멹mp״P1t=6u֡y_7hl6_yv`!Ϧ4)f]&SߑnS7ܦwfRnw/ 
=_ǃ@KХe@V>T*UL5֫0d`cFWkT8U[Z0j$0eYŔ1nl?ϗ} Vsh6jk|wkaqFϜ9$khbPXf!53h2[V;;m._F1ax}ATI}"˥0xmU1ZB ZEK&:(1Ag2޼_V<.)^{_g-gGrk2RK"/xkyؘNl4J,`%GuEQ'QBK9XXkzykʇ`LZȲJZcU+(a/1 ruayXS>iN&`8_rzO=GvwrcRDj[Zl %hX 6dv wP5zZmGCB(.}^aě/d*퍛qk*FHX)Q,[r$ec~^f&^>]UnYޔ[/w3q",՜o;]N U,XE2u*x$ɘ}" pv5M [&.//s/\#Mkœ[.ps=|qCd1 tʣR%" ̶Vd2Ekq}D+X ]psA ~`kM\MˉbAQI\L"/69=>F\",67 a:B'Btą{o^q1GuX3J^+(_v-WkoW+p>( &0c>&>!}4/Uűw|vi;sr&jbu__&m4vYc4),z1++o9[L\K-q~yχ_ >ӢG?{FJ#Y|f;X,p3Xn'qWnɲc=,m)nq"5]l*֣/ PDR'HB ލpZ߶<_ڗ|-.m5ڠIqWmjUywoMZt4k9=?TzʈLb+&ƬUčtId:p(k2:Ѐӷbӓ!6ZNwCvz|YeGED/(?UR$cb2̈́, iRPʹ)Q XOuL.l >fUy M(bﻨ](Wgu"NB>Xh2$Mx&9 c.sF.$5c`Afo*Q:pI3' 8IY%-XeX{#g ~g}$ʓ.rmaN&5ZW 1)f`PӗfXevcFFmU'!LrPeOeǬeIߓjY`Gڋ*_s5?&eJEɐ862 U,9xQRN1僐K %EzI8 48iy G" {#gGo)M/2k/OL˝5m:.w )C`^& LL۵xH L5i.9@*oL0Yo󊠭Eg5K,`ykRpAgmfcQ4䈦,+@#q=v=6 E{F=?8ΐ_?>kG xV2M$0VGB1i/I >ϵ5Bl viyW)/Eª Bn ]nӥG ŽYS(VqZK؊) ~B1732rhc߲䯓qc*iU} KX4kz0z٣Ykf\y5i<.RNDһCmޅQ@ szgi<+\!owޮ=`$Z<༊I?8iR/S\5Rkp gbٛ {F1MمVBgn6KٜgݵſFFLr4`=X.1ԁi&Krgz׎~-r" Sz.2r >|[8z⚭iwO,{~W;+?V5_0Ērak*W1^kohp _Ѱ;nB,[5N? f ~*bupuY8{p(]>o6}(w=׹Ǹ;:J<- $Ӱ?HX<0ByQaɼRIZd܀[-ZY=yYDiI<glC0u;"(/ךзz9ۗh6Z miZU7-bOM%zNh>ܜ9;namNWM<9oRtrk⦧&lkK~yTY*wpK]}dK%, 趸E, 9J)BQc3#mT<$zR0ly,rFST9.[2Fvɸ,3gY,|˄b!^3؄down}toj|su9DM^&!È̊W^c$C+}vUtc<ˆGX El BnRMhU)M7rKl?hv_ܱV=Km5Hnx튖6aG4Wޔ tdHg dPHd<7*To$ iȄ I"i51xũ8 ΜYFHNW7rF_/X$boD=KD=HA"!ƔFR[` FI%Z$)p0pö6+mXiGNz=RkDNY5ǸmD# 9IjwGFƕd63G6s\~1{,jkp )%稅u) cu]*x_EΉ|Yk#=b(m}?7 W,^'a̪ҦrIP=trE\G O_o!7QkyF'sAmƪ.[}LSgO_ȃNYW*z\\wodNXk ^z[g.-:CfYrfYcw\˙6 Y-5:jb9.3 >̮Clxju0;,G'qCuB\!XۓQ ƞ"GǮ*a;FEܫSWDdU!TUVcWJeqՈ+3ffFq6?ZBfQiLn! z4P/'yǜ]lx;)"MO-sf^*]Ggm|F +0)mDf SD:eO!K }R\wNq@O//R*jt SiY?4Jw&y7ˇO{T+,Kb? 
0dJ%هCI$Pv(>_J%J%هCI$P}(>dJ%هCI %هCI$P}(ް̽g=yFtPY1 凂CAPP~((ڱ]9O=$<@ޣ.oeٷZ}-/1S(6dhgwE C梥}ZI@[UlOU9|U7ӨS=ސ/G`tᬪ]N{[lM};7?>75BCC6TUX{0?3oZÒVlDmR+8q+ A*+0'J+'ZTZ {P+QWr A'cL'3*.)P2*BjiM BV0E* c53H" -rµ(;!GnۅwﺿRܷDn$mV,6 3;#2ow5n_ZmͶtl%ȥ6eM-[wv;;r3?L -nVnǼ ϓ[t׭quKѥm|y jʨ7D-Ys1E]#v:w9fc篯)rIjnnBfu=8yxfy\rڂ53`]M4u~:?:Mr2㣯6'Ӟ<^D6 +MPxYX昳l欤T\c VlsWGET HO#\Q[\eVq<9#T`ƹl6F1.ԓf{:&cXX6+t fďf>pY]D?*{?y"bn|lO!RZ/5Z@F3Z,2@JI^2uRk.xo$r_s '@Ӷ}BbL@ȁB* 7)R C`NJoИ 1NE%nF'%WqZw)?{WƮ_%/'%$U 3W΅J?M8${ݧd cdl/BmWζ&e;EUKJ@>ן|];l5)<͡k~^]R6T;[:*Īz)|1QD&IЌ0ryO*ھcD Dlkܤ$1aD\}YQ9_Cjf#C'C-R4ڳ;fl/ 4ĻN+U ;qDk & & P.)b/Q. E!GMң _c"p[%">>cyx7iesevqx?sc{a@)ڐɸ{[:2Y=Jq]sJ*"I}* pq&TN_[&.~oX2xt!F_/Wwv)mg'_x~fkAgyCDz Pb1Xkb:!JlRAf8'.<߾ހ <=XMV } Er*2Cq bv^$64祈LG5fn4F׌Hs};u[ p1#GuZ3vڮ|rt~yt|UB{@p"W`~JmG-:,ڲأhBE\v^U_*BPjŤ)5ijy%(p8&t<,8QSwa\CWe"sQUsd MC[0P=5M:}*:o[~l.MH}8'D}NI5 !*$ }.Q ig?ɳށ,mެ}MAk Oq%d_1{,5̨..[5}Dy5 =NF92,'&4Q ^$}X=-:gyΏӪ/WY\GǷkH ~`vnL'λuX^s_ ;K>!b\|<fW%4QKFNV_06cCEr2b"HF+Bd 5ƹFN_{_l6D#B( g֓t]Bǐs2LOv9qsN)MSʛw1x}6o["LQE P 7$1eo{&E@D DS3e4[z70Z{)HFalFYApf슅~0⌅{hַ[$-޼夀_ک~ߜ0?M0%Xl $\%P7d5fΣ$Da!, РzըXS*`gP dukLpNmTcAa+"gD9oR"ma4,!S@ VUmv [B6P9W-DT@:HLXC &J9'm1 f+#0q6#}ģ;C3Xrf=iob3{.cj=Lfh%)-t|69Mq1F+T&$k&Rwg{#Dǩ99&$@2;d|"C/񙳡ʡΗXyy@!a -vFͨ%.Y\޲{^ ~mks>^%}@bfz yWuNV_y\46Ɛsc'4MO.&Or*;U =?U2a KBSo6dN%. 
\ R+xv oZ`8[$[˯gc>r;'W%9zE뷫\ Á "o]:_LI|[<#vK 3 $IM y;O9w)Z-_|ek .]`S78A($S1MMj+7V={GĦԥ [@B6b:%"QYE pmalSQ4]{I^jO1|n2lx- z.ӋL}D}֪KI0cs1=bү jtzLBL.lV[\3W5U︪ %ib#f r XKkL +bc+(FDS',+KovY)DgP b#UIcյdQ,5=}Q,.aZ7\&,TIj%GjE&Sc[,/ޟvٵ+TZ)Gî9["sسNUf %\ a6y .l"=)v6L-liW3[*דU>J&i%0F'$F-z&EҘ:剺A2ܪ%=&0_8}`Wΐ GP"TEPk1,11W0g?i^"%u}+.-7'R 9#L5prR X2uՒ#8Ǣ^<;tb;#/{ x3f>z[$z 0ǂ.9"IJe;pffEDToWu fn0qyWp9yHq;,3容:!4>I,9מLTGMcyܑ>LCv3:G ~[5qiyt;!oΒH4ۄ97@q{tr,/i_&աz=YFL90q5nl窌Ԗ6bE{V*W'rޝo˥v?O*ݧ~Y4u\j2IW0ɨ]#Z}Yi=)Ϙ^L,&!{"d\| =~uZ?d_%KoO?Nŧk=/?tE(uɒOM^~xwm7w?,Ww_sPzo}Pb;31-x= < ^ĕ'OD*"OwIIGR;ρ;P%*HQPc$ 9~0փcX?VMcc{9gs̽ ?3eWn@5'wH~n9" ˏ?ځzn\gvt vlpt8*&"f.G$.+l:(GS&Jb3Iz!]뙮o75BttzgsVi2YJgJu J \RX|=tjnn'2zNʗp2SVL`0O⅕by+J3zW|*֛sIog0z%APN8ႥMf<6WLkeYc6~ոwG֯I>`\^oy޵Ƒ]T,J8.< D9}N2_T%㾤#۝IZ"lf;><8~*HouV ԋk}8_.tP:WAnb!fjpDk'cEdO@7D$z &یN,[ S3)TP!:L!w5ֺ&oy`NU !2`D2J JHR4):juwiO:$n nɧ_4vnaWSܰn)[Mvo8f=CVÇZY %NrW7ԀۨLCsmoN=ٻ9_tsRwsPa%~57=`{g㭏v\!2/ae:_H:o3pIjmzGF4ח>- fV(_y/і.:46;`>.J0m`~6[cN}Һ}'dw4WǦsݸ\\gG;KAV;jvP#X&q3)sFZډ7Ws߃((fC5Q%ͱ sz,\fBJ})Ǵd`@*)!}NVsBÀE0b@jHYHSI\֏%y];p8h{m'O}:[|~}Nv. 6&epWCl-;ԙ QD8YgMO7ح0xL6,Dl փQE|5K$ ::ӷi~N& oyhB|~EnK\r3j+qQO[gI777R^!:JQQTЊDYA;%w޸b!J\p2< 2g<6ɇ5ZȰJZcg+(A0g!2t< Nxo :64?7(VbZVBZ+6Y *r$e€{?^f}>-J~| /K r2|_>P?v;nB3Oa@O$ړA|kRQgҪaTp@!f09^Fʈ[,dU.,X\e/G$z _/Wwnˋ ϗ)ȷ' 86Q XfSkd@|()[VB8.8qqbϷ)a'0 (,di>3Ȯr2)6 -FA|18Rb(~oxS.e1ޖ7C\J䐠_,̾hVP,ZVgfh+H%TD86[ JUBJs1&HllB.'LŊ$h.{DjV;w~ ~g Gm7~Xv=S,gJ " =˵5]Ò)=bA $FX>Ye} e|^u񔽬#^񲎚v3 ԖQr΂a(V\tldUI>(N!ANRe}?)څ AkU ͇0f+@́ k XþmFYɺ0Dž@G˭E>VKm@N6ي:Y@$sbo\Ӏ`MǓܷ(P"<%i`"PI|M@RsUPv Z8j*L*⿸I_ji:Hr5"vh1ŊzW\G >\Hgl*Yb*{G"UcJ IQ2F;DԔ^{Ok]+\#|e潹#.?@ܹaԿy9FHckՔ-Z8or{﷼A>Kؚo.튖"j .;Gi[XE_.Ƞ˕oXٿUR :e׷roZY x֚sѩOpy|k|H ޞ4;g-Hwo74]~kIνa}XHZdoFoka7OQ͵zs1`XݽY2 :^TQ * L5tV*bM&)%|RdD;iߟh/q ĠO<`5EL'|)u \ F٢vݝF_ 5 H#;N^(n\MrN~pi8B~(.)5t,2NPX- nF+6H }GL`tav|z!{wGJYŞyd{ݠ!A3W«WW W\SY4يuZ kʠ PO'ov9ϫ Ɨm߳, $` Wwg^?#1^no\Vwq>FoQuK@̅,*9EAWUJ L-*hRt=Nu7b^Nѭ1Jt~ϾH8? 
} -7fWl6r@k Pk}٪uˍ~VcfW_ΐْMO Z$j sg_t(_6Í,|tʑNejd:E0m̉زEnF^ۑtE{ YǑFƥZ}}Gi*4*,jccmi5aT@A-jG9c!wC-)tI-I֯V/^p6U1R-c췌Qʫnq-tml3o8o?@ZH~nW\(O Cͮ>._><1GrlQpLV/M+ ʂ=X_IԗĠVnC+)Cnf%/3Qf#Xl;[noj^9'c{IDZVw~%6D8Qm*I~)A{Fa d%>kR]qI#QW{X):! 2+^t6VkD8RonoG~ XnqE -b,dwI|,j#?"E$2,MѡCTA|\J`7f'`d6;C2Z!Ɯœ6fMwy췈[ONd%'+ JٲsTfiHKSS+EmCٴ#m+B07]*\9A#&E%҃|Vd<)8*Y2(]6):0M1dJR\ &8HsH9Ƹ[ TQ9 5X-jer#~D\5Qo|Pn4m:c+'zI;/|w a7J`VxqӋ}^xP93&!6C66MdUAnHiwZq] J|,294^7KNgH _r˃蓿3 E!n2t/)(xp|s:{cz>M#}s?i|XiDҬݤ9Ŭ IrƧ@.nBg}.[vpg^,kAG@IOt* ]~n_[DհXTбkh|]z{L7h^݂oշm(=&.MÆDJԳ=i@~𯻿zqdzQ]?PCTۜSV eڶl}A3a规*Vy*aXU챫٣U sV<a@QX(yNX9iX^$er+8297tL,(XI BPRhs㹂sC#ĹaONv X#_ t@>whȖLC=ޞd+BZeƥgA@dC"ι3\-EK2j} KAi%-TS\Wco>َok|Do;*۱8IcD1D fJ)} uY/N@ͬ'@V~Q5.J룴ϵ{JxH!&'~ 1jfB:")P֜64Wq?..ܟޤb}&Q|OMiM7*%͜+JCp:p4W^ ҵӗ^}n~O>Lٯe_cl|w8,V5`km݀ZOi=BFH"X¤8&JlـOJkD"˾.g' juyDY7Q>-lb1Diq=҈ 2BԐ%Ȍ 810MNiw{B) aDx6 CS& ږݡTr֪{P=iSM[eR&AT$ S̘AY%)X*GZ2gQTMT.LJ-%T3-jgG2G`M* K&#Z`8,ʚEo D* 4&agHDz0zN0A :۔'ʁl0qnhg]Y/)*ޕh}N{8zkiE G8$D㘓0PXM`HxQO?fp+-xbFͼhy}^%"q"x-P:Jn(VPk0'87@ߧ)Je椉1,J&NK10FH9`$AW cg,f0SLܝ1h_o2MkeH4Nh!II5蝏mc2,r"ncZp|RF/zg:"wذrS Gi1) =Sr®!{ B2ϊb$2BɍtDjÙ To427ԅNiSeoOR~,H&OSlbO7D>E\ȧHkԮ)-V"gHcHpWk>v@N*,\rrOWks]0 SJ3??zL'HIQZ\Z52iAwn!=rE`?|kE\IZvM)niʪ}$޸".}qWEZywUTU~5JzIFvWӺۉk9{RwuKi%{wuK)v]Ivsw՟[]^=(GB퍻*J/H`URꮞR$؜.O'R~ퟏ/!} Kbe d bk-O?h֖M x Gu F%HeXFa 3LM@vX }1`>jzܶu7 7j饶ngK3 zZ}sJ/7nʗwHK닽gect#5ݨAfkJũ~VJYi?+g~ϲM|9vΪj9vΪj9B!&Rt*i;|VSNVjgVwZN+i;|5N|VwZVwZN+i;|& kuW>Yyj`fҷPeeӈ1Z|͚\o-X^Us#j֛5Noq7.ǗlI#0Qځ[2qG S )xD<71 ߊ.mhkqiBڡ/7i!LG>5e5ޤq3df:t4W w^ۜHYz˨5H@PR&crYFtp 2q̒` ^LCXh]hiic?v-vuFhޅغ5c+~{<#I(8XDtR6{F !'>")gx( cj(P'\#:g'yǍB-UlĹނY_O'y޶ҷ:Ua1qKLWD%´GK+J1yAԎ5t hX ӯ0ĺ}04;Q֝dy{ޙRYDΔhN&,E:i)TIҬa'JW~PPɊa?} e ą`QB߁[ @΀I$(SsDaBĹa򦻴=۽-e&m)C`^%,ktapL E.9)]\8FS1|M6g5.+,TyR N%sc`dh6H*`@_?;"ZItt2Uc0֛?>GA {*C/*sed LarxEUng8TmFd @)JA# ыSAkM19i! 
kHwf@.wv􊜔BY\ҞJe%'h@&Ҕ»5Y6R7h T}-VMxBʼn~4xV2nE3KNO%s6)Ͼ駳/௛o.}8}h^g8MN[MTCh86oӧnhяwi% @rm.{\hږգM|MWyHu#vu Ӻoo4Қ7Q:ˆ5K$͛ɇ"4M佛?EͻT̅cyKg<6FGdSG9I xF<58>毻zqS~($~ؼrxQ5Y e%)YF8½vc\2qxOpfΏ>4ͽ;L#lǖ[M1ѐz+SflZ\0}gN4P_HM)~":P4uSEwF_.[W/j|4+~ (e_&*>s ܒJe_=mӕopU;l)۝ fo~J6F:iMV6iTmU)v]_Drq+4g$xiP`Vv,FEkuiX1uz:YWcEKT\л ld"BP a 0KcJHiXRZŜ}L*BiJ L'N~.Wi lOf1JSWjV%# 7DxfLscԌ\ ȀCb,ͅbĐ.#esz O,8CJZ^ĶFnVu)-=×NwXn^^;9VtDHgBl4HhyQfD 6X+>W}%/22TzJr-K*ϛe&ўlJ+zH%E!}0HEc6%gh9$'`<*!^y$ IbPs{.aipKBT!:eLpȓ }؜,d\+ẖah>׸qmnm6yd\I^Kq1v Ű=ԓVhG,~$ů+徰i:_Ҙ Y|c;7q3CTZnAZ@Pj8Y 9*K#CN^8Ղgzʑ_1)U4OӍ<,^!k9ɤoQlű.>E>V6)9hLM } \ LfEkPZ}{ϺRb)ni^R22Sr*1Uض@g 6ס W_wg3Nz۟iBE|k{Lx{o H;9sK1c)׋977W%r+i,SY?B|Ĺ?2zы\'e,ɕ2Dw|M˯ [e?zܱ-:SPaVEw,7ZO_R'\$箠 ?AD ʭ p]2&bVbj%G5DL ]gb8Ї(V!or\z5n~qL|Nt; 'H{MW%wCM%*׭%FO,xyƯw92b`9 sA dDAVS&|b)]сLߚa56匮T*~uP!;gm)j;VaQŤ;C7re<]ymK?EO"-mk!#v7Sܕ?eQ7|\]Lh)$@ TKN,bPq-YCY Dy^X\\]l y^[O15)Z3ªZaJlͤ5.Nj 4u=&w>b/_P+\>Jk6?"f.JU4.8I}7eBXU-6Gš)CzR۲nW!>I_MBXƝr79TVc@րh3J.Xq {;}>31)p{E|w7y}[n>POS[3f$,@!ߎ+]1&g&i# l$#2v2K^oduh松:҈Mɭ:;} ў|[DQC{Q l.3+, 묔o+`%EO^^߽f'^ 7d"ud+`ҭ^)Nw}@q`èt㍃\n;zGpXu\ }~ϔB\( \k%Ģx0oj̇SϗPEL ^reD[kuƹU鱶 Ə6y~!0vc[fatL N?{^Ma9 76SXlr &PP%3h0.>e,v|J˜<ύ{؋+yI{=lwf ))L-. Fg3 Zv*i[j'-(l_ j$pFYB6`\` *e5=J8]MuWoש?+p{v ?3cqIZd@y%R!A'+-`\3t yMΩXA'}HQNL6Ш)8 9j{7t#gτU1IQRk'E%,zAk̥d5V[70rx)|Ca4T}/V l*yN -Jšvpk%JYȁv и[ڞ׵BTBF-b0Fv9%QL"?H9Hicjq**KEe+VHICRo*fvu`j٭NG1XUt㒶.|Y.z1_?*wި?ÇC?@c![s4ƍ\%X hܨ4z 4~Ɩ@9"j{4p%j8jԚUWo@W`o? \92apըT~7WȨ5 vn[Ē/? fbĮ6L /d~s'':i_AAk3}u`=Y@u40%8nZ?vnTO0v`-_ma'x߷υOgyP7'Qrrq^6^^geՂwY 1xbĚp?wO[Y<=_b?q N0w:=b19Un#TOK!Udɪ~_x[XR~[^ h3^+Lئ_Ղ$PP5?]Oek=vyOY]0cxbCk-kN=-c,ZDL8\_]'q3.=3nڶ[ ّĴCEh {g`Ŋ?&S#㩑t,MvEӨtjh^Ez3؀0#8ޣȥ.w8jyzWp&:tv|Z:0)r~kūZ =@~wU+qfՊ,1jC4j=䨋rK,YEQyo&"пSGwVDK@oB毯fB2S 콩+z!Ukcr*jU LC(FK FD0rQk@WANs"b Yκ-W f F8ƨXA5GˎT@q"QE" : ň(Dj^;i"Sd*ശ&!3hVubg}ȉ$rb+MV7U+]Mvd'kٝ*䔃jFqņhJ8Ɛ3`ƪKM! 
te\r'rr,MAMd?IU>Ǵ%GR"cfe4u]tdΣ kb#IyDOWE}dcn,r`pavl|(d0J^*RU%I*%Nbr\7޵6#EO m00H$HX\rEvdIQu߯-ɲG-2ev/c[MU_<‘MmR/cV OA:qȡA K}N:o㐪R-z k-jy۪GV?؄eK,(ܠ߂aADHFE+=w.ׅ!&(zr Ň(6: ox8.o7#DhP`p۸ȍ>Ի^}n<:܌;`Wone x~xơy׃c3>قs游O~ DMcvE @NG&FdI1- X-u"p$+{<*% (ކS#ģ\0(9D1%b Ogw&6)80"`) nqӯsD<Ƶ9{oZv +wHfK-*ASM @I aٯ7F8Fȼ|zٌ(]=1P(U;oSA41r%tYT1P-W!F!Hcv_ ,QkP{IΔt Sg B^٭>^l;[QՈ×=&yͳ5Gl&K{7ExZ`Cu<[Ά귮x1m%Z%$K%4‘ Ibb0J*.#3u({;W<d6{өzz4E$5cF9J)J"Q:E! +dڂAp݅Qg|c;qšK WNq{R9|0E#ډa@i\NR_Lw}h!!]D= ~780#,J(,AOI 79&bQ8#Ae@2Ѣq6L)իRj)8L*[?qwuVm=ّێy}doYˑIXlD[(Zlkwώ99ޓ.`:~[fTxdr]?/E:h1j7{T w1c[61u3E#wFKa3ţy,\#Ǘ-4<߾z'NP  Pe :zOOɂgR4{"ID L5;,+p@ip1k99 Bj=ӂ&Z,:Cd+TSF{T(́!]V'U˥o0 '̳'$ą'N6K~ߙPy]\/% x>ו _P- ,37|\Moz:2( *2g^^4Ë\;vPer.dDIZUpSsVgý+iwb_Y"8cQGY@QaMEi'țHAM, wVTY.+3&UH`E2(PR K#goYo@g}؟DC[[%.y$/*Q:D5Վݳ};=?ִ 53sE c5έ xML.MzoYtCRi? @~4jX3_Ψѫp[zݚr &ACZۼ _󠬀 +Y9";787ġ a:X!W r9kIg33jn7PNٔ]j95\Y=m FZ\[4 -bԣ! $b:m`3B["7B+}z)z9q5j8MdGD+R9m#1*6,W*2T=z4f`UuSYJR5I"&QVMC@*Sd@dͭe}.}>@bs"ᢍV?hTy#M6Ibv]ղᵻF`ݰvwu1f_,27Xߙ_u|/ia_'eG+QBg-Zc+R7˳b;[8;ZU*i|cPIy9O%Tj' :c*!<~WB #AsD'\V9(i1O-dI :jG!R"HȁS(qYw/L(rP,R9('"ܿb݃ޑh3ۥ|,\b;{Jދ߹9sj t^xߥTB !HQKԢJ X%8RT2l7]Av}w57W֟c+! 
A*cZ R'#X©h?Z\e"!muJ&F:Z"Ēc:p%!:2D!\\EG#,Jrk)88nuIL6;x9Tzu Ų=vSpS/?o.XgTJH*.T6(Jz w*} +v*}$ikm m7˅%[(KӦ4bvs4sfb1bt%ً[Ik;eꪩ@hwǃ>C22Ziٱ'5Bs5smקݶ|7Dϲk#C.=N ߃B/?.ꎉu*s,\{-9[+\>r边aEmXu +bN:vh"YnmB|6?4L3UJSZZAة)d*)aG~-8D[p϶냦إƄ !&+"52Ʃ s=yNל@Ruܣcy:_qќ%;V=>zF5Z IXO)a-uFȹ@FMjR{U5t2 ,ʢad24@rAkML(U" &n=Bb|rhK^;x3&Vޣ*^R܅Qx(K&a+\LH E52:Ř'9Vez `d.YtMJi +G.*[3nG)/ qơPօׅGՅΪࢣMr{ql͋h4 [TѨ R/s,0$B$ +EDC+}r\(KI2d3βJ%BJEj#&&,RD65v1rvkl?5dv)8TkZ[ZG8^\E)or4Nt$Nf ҧ`x_T& H*BFdHZсF]S!p\™8$Q ڕY1'`6FȊT4b18T#Q׈8>gI+2oSAYtTiK4 ,Qh@ՈPBhDqIiʑ@#@ -i&4wSk kbֈ7__GX/2OYKՋgZHBܧ;dG~ !}]܇[cOY1MiE^o{MɢH",Y"g5Uu=x`G8[9^0>[·{6`^lYmYqcdorntXQ'sA=ڤѧO rar*vP+8mXi Z%;_b]T1W%0PTu0F~l#'&Zj7L~64^*Y Z1#^}!N W{Akt:-^%iِuM|bӵ#>0@lVvA ^~6;8[N+I蜮%̠U>mlDږ-6 ;T΁8-ެM H)#\`0p`99HC]Μ=;/k vgJI!F54xRVRBEjK!Fa; d$ſvIY`-FJXjHVEebluu.I{l no8z *z*/A,YڻBeUr& şʭgneFkK22D~uA0Z^T7YqTOBÅo6#y!c=(7}:8*Ր.;wTVoMAc@<Al N95SʭUK1DQZ Ip@TŸ6QytTM9+2f-v9;qd&P!hU ! #n$:v٧ g{v?4J9_k뛃?<{|w|5Y^ ~ )FEFB(%@K@gw]Av/nDsy5WPA7ѿy!*Ojp&r2b x16Z"tvʢ'17z1OTgxbO;QpU@B4K5:f32jϰi-}NA+&3Frh] shEo6lF^θEP/l*! AL==]r,F}]&*]]֨4.{ˌ fU#ިF^ﺺjT2+QV?JEuըU;ڌ+-GJ{V틺j>jT+TW Wj-඾G};%^rGgG7k>NÓ*h#57?LO/fySY9C씚oF7GGjިFwZ;kT{jFT{ xuըTvTWߍR\zu}qu VW#W˪Qk˨QiwL]+5޸pA+2m,l_%snJ^ 8ǼK?jorչ\w;Z|Ȇe)\0כ]4!0Il$#Q;-WmVQ:)N ʈxbW#ُu|7U1e2頵֬s 1g1Γneqy""BҵV]Xd+C zqY{.FS&yj'b5U1RiMP4:6x '+0֥ӳ/"qȶҁK y3)L !]HIUq~ˬFWֱvuP 5'E4{Lr jyo_5瓋w.hiuf)wwA_J;p__ .  (f08őRdX'J&EFt mB[WjwZ]_ߕ׬tq{ˤowe61,!%L&WϖkA9Qߴi~vyv5v7ݯx?{{#o\UbF}|F}_ڊրb褼fhQ؎H륐g؊FݟFRѨeB㱢Vt(4 ?Қ6+~?LHƐwt1( dce`=ieT&lE˫LcFՒu1gkAN:aS4Z\PTړQvƤW9 R59艞2u)V|H1̐i튡ȹ6ݥxΧLJ0btK9U<%4`usPbF_T钯! bx]^f/^n?Ӹ`_N0}I4v-ѣ+^4?L"z;8k-r\.&X弈.bHAMB*9}1y+-COx:k~1>$n~C&$L'lϣ#|_f%]~i5!xY= :y"w5388vVk C~9_pP-߻[E8bH %pL.E'EIeJ4=Qf}ai"F~PjЛ~u=:/8jٟg~kPoTzaNe_?=^txGW1Tn{[΄eyQR'm=οpaMV_̼iomw+*BshNr4IG. NZ16@a@ЇF1dm3l Z->hݢKӫݮ*>7662 ZK|(ƚ\k$@ }$S[wRM z̯tˋ^&JYC4t?Bg|:Θ:D 8x颀.Z2Mq՘}[%G<{'Gc'] `cD x*gB-r0l퓥}09w i2O{ 06;Zv/o0+KUKRJLVVd&8.cT N "![8Vq{mGUU6J~ھdHPbLJ$KZY] ,lhrr6ȚAQj H(! 
^^Uc\G*H靁k"/jA0rSv={=-Vo?oꑬi][H׬^araҳ˫k+U](b?> mtG࿱T~'Gw<:[|y~-Ƨ_Gv<fx*-p;#7χlz {eγ[mG\82-,s nWCwMxX1njXxL74?(քСZw-6\ Dm>tqG2?[D2 C`'cudˑ[rMݿygf?j%sf=^d!'IȜ0 c"1"Ef ИK!i&QJƠruᵀRE%JS@z#0Q=31/4&ŞWo0`j5n3kxV+]dԪ+~?:8O>SlelI=ifQPrY! ǚ6䌘dMpDsy|]NT(艗}*рIdMQ9 pfjD21%A'=d^ZId#sJl !9M8gޮwWiL>~ح,+_nMJݟ^`M]|Zs?y>?KLx*xA^@FJ-\ht( % 9 xl~!Mгz&ސ3VJ%"DmJI:k, -X c$&iZ)F?!Ot=:ON3#Njɬ]FYtٹ d)E6%uNfƬNZLGY~4J+}jFgw-C%Cۄ&Rڲ;B|v_jUA__O qg9+EFB0FڲNIUElآL(%g4L0 ~["Ձ܎}piSn/%h/S fÁEȻD[Bn{;]M٧Ag4[<"wLI%4+8_vV^]N[2xT!Z_/'ZJ m"lsL0^d`ΪI&6@$PDm#'6ቋko_s+qwDpVYLJ35gL\*z$@$0:c9Fث3C/psj7䅣ǪP\_L\ϕeQڛ|[}-sjR7lȟCJFgoC޿-,}3>y^\_feh_\eן4i健]w[hw',~ g~[`VgOկ3W|CeegFA;DOx6|]oD>1.O:ooWZih_*mLJ j Ω$:L^we%A'@TR&tQ(9Q.4rY,trV(EV;6P&Bŕd.dtCcQfKm[mmyZ{B6v٣ho:K{G "JEteL֮ǒuH<9dB у.(x}m0u .BBd Pu4*R(Bc n&v ~g}z\GuoνX m*\Mdޮac˘iiЄ>diy/eQN'&WШ:;TtT&ɠ|͠T#2(8 "2 -uT:%l$6Rme$64[RQ}jEܺ7Uok#D{]z6 Nh *Rl"'(dg+ӶdK a ( 8?k?dXIjk5Y+\l݌85LKaA1k4'kޮWVQvDxe}%a ΋ DQ|At5%B$j=[G zrti'j!R0؜ [J@c( R8ۑWifƾXc!NXx𔲛dʌ-OXYe]r@gW;bdtVu=0aYd{3EІ,$"QNP=JBI%+l*cA2,, v"g\D1b7g;b"11GڭfǾmP{bf(Dd:zO87WFǂǾh#qBč,,QA aw3EZ0*IL +Af/L\n[t;Xd8kV7_RqEdVe#[ 0J?qq*0&_gY/.Ƹ&\pqg-(Lf@6|RV- VdtqChhJ쇇}k_6wտqSLяh5яRق9O?4w){:I?**r_g;:pY#RBhEj_C[SsP% !ahtAmkO,Et:8}L#MJ$L_3Z V8tq;hr劳ݹisb/oDylYӂ>Y=3gƂbpkw&O֚xױCsN٠]p1a6iH~$cn$O4"y8{N z9&ZUɸ}Aڄ"dvijܺӒQLQ*fM!`CL J9!$h$uw+qvR=^Y2Ɔ27W ;$g4KX̮1 Now]KqS 4eҨa]JXI/#!e I4a(*B H -5T|D bR:R{ˬP&G3qvIISk0]AlTт!%D#I֦6X*4[K4&c01e$9ik!]WD+_B(^N`yU%zOn"Nјt,&eM΄eB brT; NғTPN:6"X<'~7FZ:UA;܁ f7w^M=س˫:_t_|[<-v7tNuuHvl Gli}c;gf;ki?/yꃂW{)o=}sѨ2Pq:R2tS:M$)$Hc q4"VZ4pSLK-y=H*llY ݻ{ۿ%'s$Vr;=nBҸ*NCJ) )bHS+xCx3UVDJVѼD _=\0a«'1~eu4v_Xiu4.הY+xN]9p!RWİ4ͨ+^uEj箮KeEu`e0*]ԩC֏LXŚ%y5}XBakY~YAW ERb҈J QV$s!J*x.vϐs-Z c?߯)m0k^{30qƳR%)kк`-s~ϟ1J~Rai4&avmˇzяU˟4}y3Z4T1nj~3+㹶xM^y4-W3QӬIee!J&*o]2 OH(DW*ZHhS2!XT J0I:iQ͘fLҭ& )$%(*KmH3$@ @oMLGf1ZN$l'NJ AqbDuǮa*?6Cnb|Z囗Sf }VN @,-SA"5*#tH )*H8c1n}踪9sJ`Щw9 'iύBE U̷֪-vGw',]fW'xs.Gx_0<1G9L[J믬e[կ-uCQle\ԕtU0괗LlR 5N;u/F7TH9@Nia!1*^xQ^J]:<ًFX{rlBΓA@)ȂBVFLupd˻ L !9F$6>vvc8A(Fah\N*ڠM=TYFjB`qrT}Jx]a6B"7qZz|F+^Te:P]Nh" 
t% HBiv%&HTyP*Z(o*"rl*jV XT&,*BȺP>fnqIublWNX-Z?@+`X!]gѝmhf#KP056#4'zu+Gb es>`t1i7v\7*aՂ ٣;w669 6IҢGp}mIHu53ZSP'ՓFCoׄ bLP.z~;?AzQ{20pPaRDluH5\Qr#b"E'%\I1mA;B 3 R)1#K3HπOt0 XFշl06v y߁y 8McP'7 l!G߁] +UA ء0fE5FHKH,Խ7nm~ xCڨlJ%cT-،znzgm͊؁dXf=ش=CɗM{&Db2 *R 9Ok=F_v *Mf,ƕA5^5pg^q5* `*ga2ي҂N3 _B\R M F:`=[JFDhikJr\kCW$ЭwE< \T*΃Ec6ՔWNDL0r( ;fAjr< T$b/9y\V~Q8bstyp7gͷvnhmg~hջtq6z]a%30gF*A:%'48œqmԏ 4 @(rTP'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q=Y'и80,T@cSqmxx{zBN $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@zN r:)9Z؝Sd@x~N r7zN H&c $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@zN ΞȡP\O @@8('8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q='[}3؛_V~V.o7/?\P/Z쯻_L (:%fOǸlOƸfRR{1.=o#:%`a] F ] j\8CWP-|a ] djl^hzt5PztaNS6>XZ ҕ[T Q]-EW=u5В{t5P]=E v_ 0~Y}7< g? [x9tZ)>c|]ŸvhEo^`pI ڜ=7(W>}OEŮjMy~7c[/^ЮW ]+f}^^>5ce z!ڽ|y?a<\oeWp5r}+4&J9i^Pep'ay}~uW#w\`Sӿ{ᨛ>QQqʜW'}  Cv~-me&WA]"z4LjWkd覘jƝHRD,)}7m{O5A_NCb'4VϺ &mf"I⸭ȃx'A,<gDdCoS&զMwg?Oeia5|*K -Ǿ0P S_Zwz\)|w6ߙ sߗ mN߆2EW ]wbO`O=koGdO#!`K|^DSbL Iv $DQ#ı4S]]U]U]]vE\a1]\^\=GqEbs K1oꠏD"rH/?p9gkлHSTJbaۃw^N:UI8ulKۃ۬WmÕ+abb`_ .{Y:7P30/jfiCdb E@܌J9飒>WtTRKK Z[Rԗ:? 
ʻI;AGQ[U IeE ;b\Q+w?&xJv,p4u8TnvⵚncUS5LjaZsIvFYCte fە5RPW֞)ejlK@x [#]3 lb(+@rgKwF\!lB,^\=Gq%)!qS; +ws׮K?Gqr vHBtq?hD,-k/D\bwH\!;ƅNbk^B,-ߋg(1;$ 3N+vw* v/ YCDfZZcFRq" ċUz1]29-t"q/]d"rIfQ(c$^eB&"ka~}Wk$HoWk%Շb ?1.Ï gX5\%v>ͩ=`H20ȿ~GbNzY&/S!x\[7BrR]l.l-Q/膥x\d6_u@ZNv]'6^]P~iq5OnΧݼ=(_t_)65)6pk3u8OWSe_9{ ̵0Y2{OĦ-2F2U3>8 z)L %(呅¼dS+1'dY3N$%3<3"sb:eiĄ -QkΤ3%T{*a X6MPW #0گ=K5?+ C[`#׀d$V?W YL)yo姗>9G?LW"JU8l1HZMgpSj61‹8QȊ+`\1<ۈf¹RWÿb^h^Br!L'fB=ɥ5߻r4^__y wBT2=XϏTYɃDx{myR&UL?^MMng0hI.ܡ€wV跛UߊZ?~a0.kݾqY) $xÚ+eiTɓBSj^rtފfŜ>ewxo3n`wk&v3RZo &\TBƸꐽVH-6kUD nI6+S̙UKIn]`NJ+f%  1&d0A5aZis:bkܭ#.$i5o-Æ>^ t?u>yؚu8̫YjU2uDvTjqں[aA_YGoIkFȺR5o*{+rIwjg]A̮#lo"؟-o7hgŰF@ Źa~)O A}YF%+^n^reη^h^eAI19/<ʚlHtbFlQ< X.Ve&l ML Eʜf傩LU ቚHT ƹLlrkm/z](`ie.6`n/fik 5䫿6+ upvuXS^Kj+ hRJHRH'D914imշB^CNei*h bLBQN" D2'y#.ej#^`omKz5M>78CpZNRЀ-v-%S Ę%]@ 4`T\F "w)iI F=g/y5cAך1#r<9$E$(0a^Q%g\h|*^tvo!8yw!+IN:,1|0GDưDi\$s}k\PsuYhu6TB^JQۖ]ߍ 4-Kmqd'Ax&ߺ#bb||OlyKʀV3K'."5~^i< hBxjZ2bXuʕ'4*iS R$q]d!'UI!#7ņS, mF J1Tt&s5eWJ4,"? /k` {rᆿ'#w_pxҙ5JozuzBHQE|ٻ×W88pCF|:iO~(-럥^9ތֺ8&כp 8=`i1b%[-^CE9r^]t{j{ K 3,vnU~uz%ЫC7o\4oy>+/1`^*bFfo8Y LҺTO?#ռ@A//Hy4&Ih8E }@89~9c|=:NCR~tC?Jcc?};[D !(&a/(5ϼܒK-QlJټwd}'C gn?F CX UIyAՠ4.Hc 줖NАVJ;5Ső˒?_^¬Ӆ2 HptnM};E]XkE;_=Jb4j(Zf%% MO^R>dBfRh a&S"-y}@J$oRK$`${$UIflGqw`חi&j7k_o }[!'1lY⎑% }riV9Fg#v,/;wOoGvf58Qֳdy˜"3Թ~r#h0 )^Uu {{@ʶ1)8@Jv'΀d3IYi/tAgɲC.ZGcه\S*W0(fk@=e2cǑDau!(XSJ2+#knټ]]>:}_39w3$HόWJ) kLԟykvI~?yG!se?nȶ]~|W.Nɯ/..^$~7goeJˤxO%ˤ4kd]̆/Œ#%^z;{r?zw2HuPH1 ]z?ya>maYRVk߄kuLpӳ juroc7|O &?b}(5uqjфqrv%/엯'_^ #7*.א웇tc_Sg鼴xXt> vt..Zm4r 8&#7{WXW~ G.@ l##Hځ i $MTPc2f“NErZo)z"bV9TjL`"8B8k&H/L<1һN\Xs*y륭,Ͷi?:^f,6Fq].c0Jqr5I>;^4vE.!JO^0 | HΠʘMn&7~Z/*oN-Oi#x;h4r{,y4M>ټޡG yr9W[v7M&}l48nWޚ =-__C.'ްYիmALƳW9Be؇K-#zHno]p7!-סZNG{SBM@N'hgedLZFG8bSmTo]5ogI9g- ˘J+[u R2j/J7T#z!ڨqڜڼ&;fS_G=1CAz۶vXGJPVIϞ]ܜ.ཱVA)r769OĈDGpE>y2h%tu:e*)Zd]@8oS92+% {UG)BD9jP1 d9{W%Wh]ћc1is'Lw}2pC@j,wl60q۾ FsuzE9kpn9bԗWln%~ 䓢.GKPv*ҏ8;b1T*{|I]n~Fޮٙ?BμP06ou;Bwyvk*T7 O9{ ׮Fj./ 5*we s#&wJ"161j߱yCbH09x d/ M.*whi@!񅪄(c6CvRw&)։a*dMorfq^%yj [\E 
@a|qn`Ǘ{(5:=dI&kW,Kq9cȶŮm8iἬcN/ LAſe 2A"HTF!jԺg8 NRut0tY4_CJK}y+WށCh\6+FF]zQdVt_gg_4!C2Rs!5]ɹ(j5E$JT&(s([ѡj3*|a3x/4}}^}ί9󛊌iظczv7K~oC/ofcl%& )gBbq0x3'6X6>HЖ]*oC + TMe]:ZDzc;HI{[64[ismGkWgi3V 0ऍXپ$%8fMWB(1ƦBh@s4GIc_I$+2pSI"qPM.8a׹gc<L?]cF8z'_Kta,ځֲc1>j{6I煓 e$#9:ͱq XcBM,8EGR;T{f0%gʥCu6ӒE/GƳ'[l0TcO!ppXJmbт$p#H4D ҎgC%CA{pa ˻USEpc+U?CaW?K /g3DIӿrA䵓$UQBWXS>oEQ-ߢH2X1 qѩ` lӎ&-_9tѢ5AWDlۢK> MĂI>T&0|hGЭa]^O)7 آT1FoFM:-Ы|0Øu3W« johXº 7|ވ92s&u0y0eMvdm _'䉑wCQαh4l8 USd/)EZF$AH)fm!r>j)(QH|;@m+a,1l&E7Zuy~q.iQ HUe[HJlE{up[!7J,Fv ٟCC|떂Y k5&0<ȑz ^Jde&Ơ ap6Ka_AtAcȰ9aL, CP:5bDPBf϶dWEΎ-ͣ W4IPc 1DRjԠd,B(ml쐉{[{yA[5D2hLV:j!af8V~+UZcJ~~`*9ϮM߬ NUlϧEMv&m'޵6#׿6&@M!;`E[wdIki<O%˖Ge%6a-*)V^76Q5hmhUձ c7VGk,, eԑ2h11)5&5(@Skϰ0 KDžͼ+ML5--b ֆ($%D|E _ Z:4rgmH;?x3cɜܻove\ Q/͛`Y>ݳk$ů؁㳉5Gh8jZI}9Tǻ/V{=RVǛka@aߗk@ւ Sf)]Рlr{5,n4V?C8y5-=/ BslP$ e%ԣ$`"::l@'1srْsq>q :Ig]max*,zP7\. 6j$(B9=yMW++GJA7jv7-JrE(D)-d]D1ȸڂ^ ThrGEdS 9 ;ZY(JV%8ngy_ax]uN(~+_3|k|q{kr3n]q~Ԧl[绋K}#PJٚ"FjHdlRt/ȄހYۉ.Aw0/?MPB?d`f͋]8߼ ф<-`jb;@1ک>4P4PL4l'e4G1}#ܱCU˓I!\.R: .t!8e:'bg^y ŊVLI[fBRxX;z8Ϛzlrʴp9!=q˺ѻobJ'B- N_eN315\[6 3Ee >IʾIHET*RLf{yAwYrZЛ]6'ဂ2E_*-sr)lH!* 5$B]L,捗Efϛ5Y/s޿&l >ڬ6.wҠDTQ$"k9mRV-Y'mFRIZQ˕a?ؿDUCSwupK]jAt]S]jk]#6`ml(XJT)Tt ٓʓ)k'\ {,' ~$ּ-PL!%[D: )Dl+NlB">:-vjCO)_c<`zy SuGCl-wE*YGF8B0r EE"i%728/|(("첉㎿[sǵP?zvKQ˦+۶\bmLN]4ŊR6V+K2cG!YIzp ;)–9o-1g^Y+]ʿ>h?V;Vޢ1&ȖEADހD ޜzRn~_=? 
Ic$ᔲuN Řً𥸨:CN$+VgVz#3+cIE#fɆRBY@t0T|]r{щT:lw&/갽~}wx?M[㌢@ocT6{H.Zߚuƈ9k[w^kjy0?٩(_g".x} !&yɾk*(jX1IHDfAKf,&,6Wc5i2=LB^c1uV7G8o/iK%+Ohr'UIQcZ'̻ilVآ֊SolQWL6PFFx1a%/ /o9NЂ E4Zkl `YӱzFI{(6 ]/!c]:Ib{)>@BFY4% ![ R6eR=>u8\mRMm% 74:햂4X'|`UL-; 7dۣ_<'[Pb]`,vc?>mhq=\xnO4=hXl{`F%*gt1MT6T`Pa;WkR'NפN>أJѡ#|= " E R(JU(gJ@%{v)1T0N?FC]Mwwqz<2&qICMP!⒌^*Fq{pPeG G1 RyG$Д3Zd%A-07HEI`tLd{ {ok͌=tG~yvYrlYYNTmxH) P+`Y!9yk6Y^Fֽ OGgr=̽jzzYsJy(5-%2P WHVl”,yYBoy]cUxtF3ol9޽DX DUJ5۹d]#.Bv+vL^;z[uݵآ]8zjQrFvC|z|aXdGRF*rIB= FDcM V <*At+Mڳ׵'(0 `oƟo_TsZOKۻȘ'4I~o%?j{s19g%H[y6i3iRל[6/8,2σq7.uz2e-t5mk?^GS}~`6er9439GM_^3Q/1y6o5l}DV=uk,O4Lm\,_٩\[F?C{۰5N6ަ,nfЯl RmH^:~m)6))4kdhη齏7!K3)elj] /́>mY?8d2^rB%eۛ58Ap3ٜ-|>k|v5Տͻ<a-Cz'GƳ'\pNÛ5،xIi<5wU[n{U`|MOy#w?kzowLtC<[v <u]D6K#7ɑ#0"@ 7F Dv ҥB^YQ@AN2?^PGfsdiWٌ"MK}:u¥"Z6AK%I)J*\4'Eؙl9Ku̜^ա_xw>WJMtWv^)}%[%ЩxGHWq󫛎i,4чҐVRE/ANiqr֩43!لF{2 -b_pXBYQ1eW_.¾E$_;6UM:\_UnGw'APRd|A(@!0'3 gVHF+Nm.i zR҆e$ S0) d[(kIVGVDϝ-i gz|=6Nޭ~}^O-;pE,Y*t,*c;=ƅdN4F 6[\w,Neh-;0,+QeM<#`&ҼDEU{eB^u ŤW9H*%zЇl 0$%I1ɒuː'[YS'>lDz6bʛM"Q9%yH oV )2 8F.%ILM>:f䭘' BF_gjL)o\[_rr[ d_f53jč0Ѩ4Ÿy&JÆE{Ű׷dјG~rmF(4ɐ5~1?Iv%Y0Mp!L'mܫ3m]aPswY;_wp%YN}|װf\|0vB{SYj]Rh|W`'_YB_ElAfh Ip&=ývBMمZKlFW8=y:Qh[FlGSIeLBFqI8O .r-ae)JT;Ay+~^=?Mbf5W9<=#"M'|"0nr_okٓd߯-ٲcٲԲVI>|X*3v|m=ڋN絯op֢/4G~ >U} Kvߓev* r܍NWG+{$_p!L3L?jvtGԁe2:no.֨eߵvfȻ*2#YlNW40iPVxF'"SxFI'[c2ͽ[d;M;z5Dh "*!qhJV䥶ӈS)!eLpkS{[%]I=ўGMlĠG|kĔ`pm$k)@bB)RgNՈ5$”j0TPB7)`]-R^GRfǔ+1Bt&&"=PK)ՠT4sm29mKL*Kbs苳<֚ +a 5EȶHFg :z!$dyIc)FO)Q"m L7s;@aԹMP(G 1dÄ,c$dj>eP)?fرAjƈ*o/zQ)(R`1Ԣ;3Ւ M[`xupܭVx$<3]nH:F笂_gog˶tWw6M7]MG?uWQ+:+WtkY˼ƽ?fZDJUżlY[W:sX;I97Ưu7?,B]A+0db"{h.N^[n7]^kaRnshw\GE:n~_~z}|reC-.|ټ#ُyLsݥwf5/{+SS/.g\-&7=K,琛7?hBbx|UXN*D ;]؄L RMpX0٦V+{WP~'̋$𭗅)CJF03!c@"- I8_N7hR`}M!. 
-()DJ:Ee =`=@PdSv#%kHKJ)Г M,.ORko]㞜'g@Μ|Ư A것!KD BsuyqIr"hhʪ-R,B'Oew!IҬ.=3egO>̮ڹkfiFKZ 7vol칻JjP&~kCU⟮OS !6lv3݌fw4*뚾k_>>[+?ShV4=̏|`yV٫JO{|sP+R+:T:QQ*'4Q*i^ҩ>j\CJtqh{TX2zd (\U}sLP*L>8ڎ5^v#gl5C0ET0"NNHU&#ZƵ>~Nsd3tPk̬Ǜ;LJmm0OT0T5OwnѲ; &G频bp{πI>ؾ3`QP^IFI-TA$%Q֡:&wITz뺪lZ49:Cp KUDJɌRa06{3gdTr&gW_`Nٷ_V V[y7|B0*8?޳F ^›⪽g89[Jɾg䣦ܦ7H?-cqv=QH}3mo 'm{oH+g>->-y6Tr]č9{v W.7]sS7P^rS%=,h8H=/hL& tꕬxno~ \IgܧeхgSe/0"o~P)Enr_ˮ\v=)zEh@1%ӎ[+"2YTi>dI*hNGbvYee_$øb|=_}-LFQtR̺+b3,"shl(^I iפ ks%CPZUF%ʱ5 rELhg͜;s0WMϨm&Ԟc=lCD(2ڷ1ڑ 4+4@(=df؋:[ƚE4*4Lk,|HTg}x؛9Oa׺')W` "?ED3" 'D|tY"$ AK.c7`IdNX%UvDekva,o\JW TLH!d:SI+[ n>{3g*27u`\/3쭗lg\ '\|(JfY]R8*-,-H8>pq0xWa=wbJs|, n~3'z׶M =;XnlJ|WEQM[E1+o (&{b( JEP+0[4腰E~(KlE$df$F%eg Jc.$L_ 6V t_lP`{է%g[y{+ߪN-J4Ì0m都]ӕX%1_=_Aު5Oxk؀谩SLͤۛ~IHӐc(%WSU2S*F!i|3)YO[ dLR3@uA`$2KVgٛ9$^kI< ^MSQb蘋BQ$\{iI:#O١V>I"?o)ȐF BIK#E;){;t҄F!x(ֆF M4K]P K(L£%-RZu $BN<S9IZ>9ej l.&#/9ZLģ("L@j"i h Yk8l MCԈ4`-`4\Ƣ1\rJ4_0!k \\ՕyyNϰědCtQD/B|P(o0La*4z)aańDt^z3j􎇞Q'bWbEK}iiIPMR)P F% Q`U*kE/sАL}f=bʽԛ0}_%HKIpY-We,eC =& 6xv@1K S5Kn*wo(c >6}_A}{{9Pt24EP@.q/e_tN2EZ T`/=[(pj~BB^A, EG ;ݑG?NB`%,$ѩ4Nv} 9!hRpS?$#A_ |1le1B;0qatfDM54ŦjcYl }Z  8"k3FvZpek.\U \Us-Wl% ^#\:++ꢖFn(dѕ6$rv|4qm ى]6#ճg_avy~՝m`#U%?#Jkr?ш` 8'U/: ZCj%N0*a4i뫓E쳹?KM463Yz/ \v<yv~_|v>: C1cT|pCOA2+E徆0`؅h@\N=?&)0??z,Sj!QЮA=f'"f9}h#<ot0`]:4iOQ9_RTvESѲ[`M+cPU,4k 6NlV]7q4Ke [5HM$RSqF@zF(tٜnd܈\l6DBj4j- ~Z:b ޽},nW2`?\e WϳayV\gjۦV!GW`Cjuc+ZiWډt/j \UkQ +Q\ҖgU5XUrpUT8++ a\UIn4p2^ +d&z=ph}Y\Uk&zpe"̝r4pU͕j|Z S5•uR4+ \Uk:\U+jpEڹZ ہ*vME~eۅskw>㹝V0%jO̎j\TaAZ1\+Ozl5W&=Zjwm$I~\bw |qc1MIJ>YwW˴QY%[B_-ţ^u:;sV aҼKPlhw" .̬V~&H|Gg ©2ǧZ0ԕ-oz>6jѥpGiM̴2J+gճ "jAθZ:zpDk5HJhAWCJ#[0P >UF{ݤԺ+p ]eet+VCtѧՃ+: fϽt(Iҕ6T6]v&w8ίb8<7K#?nlI'|boݿɋ 9-@ThHxII>)b E/wy6,_V7UE9q>ooQF;fZ2ʽ ":':$J\RPXTYi!}'.*/D6_>_'#,ygdqWjys$ԫ7Ka~~u]XInhy܉M\Gy||=Zۿle0ߛ]͹mμx3%s̗eh:c| 4"v,,3lSx젮GJժ/R>(VMTٰ\p>O)*O,HI@6d"e!j~m?S2c` \"eB^zL@'+$v)AZe2 bԚs:=`A wTJ{d8QzY~?5GsTt2tzEѿX $+pQ|LmsASaޥH5(yl3HB\h&6F$pPXk6q."bd( 4(CS0$@ömeiS8~o=ء5@ҹ1ᜅyNU1KA/PO@1hc;Z&_f1\BDގF91YYȋu6

b.ΰ:~&,j'qqvcGC,aYd6d r7T]2וpnŊd j ,ּK\cI]pQg1A=f+;~}Z=)mO ya,F5RnpS{,>cP?I7 h =k빱RN%. L$xhGsN_"S{_س͇7(< !̣|pwmGv(h;xUϋ\3Q A;Rl{s$-)VO=|˞ea*嬥#>HUD Pi%>Wg7^?RiYtgO|2Ґsghw9L*7b0ǃ@$#hp`2Qf$I1*nxSo}Gm*(wUxb}AU[ڮ n+o4^5֎qm$ aRL57:Eb xCx;t9=h,l<$h{H.Lgީ֣M841!E M. Z^8l!n [4"S⸔D5M |:$9jP(%HE< mE$.}b"92zba]Gb% 㸍y\ 'w1v'\R|?<eCC۟^Z/YEjn}"Qy.H [ExHN 5Yh>WZF)e<ǃ1a<G:ɳ7.6):R@RAbeBAE^Mi 8ϻBh0mhQ^@ ЃeF1Jj(7I)}Xh$[ ~^a# ; ݦp6;wO}iGjjҴ;p]Qi;|elK[{ͯט܌b^VkuȎc%6}qezĨўƠ b( pR8r*SAeh/:6b<=K(IuxN4 ,Azk%$p"C^jf<۔#ib"'p:pŽj=]{7{QF,{q-qheB)>XڸL{l3$h#յGжD5su6vfTYهخ{6yx%·0UI^=U ާB MgL ^sԭg]m+ 3ڒ[[sv=EƊk" Ҥm& 8"]}r$OK}yNf&ќylX'mZȡAzAyX pAxIQ^R.(oQ <e=!lr&R#-5ZwcC_E6..OFf8 $Rp]< Ѧ@D)yѨ|l'Ӑ6wȷkկԂru!AqZ5  o^GU:j'vݥ!6);tr\v~r= %ǻx;o|ǫ8>[u3=BJ8;{7~_?Ǔ+c@Vͮ^v[gF \jUfz{$OSζWZGRZZ|['VYX{C27LCσ0Ż^k@uZXZD OL3Eb9̩{]oVW3Eֲ$atyJJM$*w&@h)Q P)1ZzՌ}2k} ,;ѻ1E^3f>_9ylUiGKcI=M P> ؘDDZD> DQl;<.)p99'/!RAKJzrqBwO|:kcW |'ec{Ƒ#0vv&,.-n/YcЏj[YYRDyU,ˢL˔٥bج(Fy"UG8OLLh+ѣ,e%G7C8%3xTϑ(K$FEJrnD;ӱSәX-iZmKm3mH\gcC|wvUxY+)Rׯ%]\MfӧcZpKܵzZtɽ3*H):C N/k\GWϟGq9@cNٻ@YȢ`-#޸G:@x=܅Qu +Onn:|Is]Cj[-Cp̫ٲ"NuP>}yh8x $kN<^Y!9E$7<ڒoHAvAvE*99v|+;5ShƜ (Q:qKqV!%Ŵ`L։ hk熲G) _mJԚ+@ciq\ۯ?|5? z5%3LJVhфyaptP!@{fGOpࡾA0|yr{C $2 B*q7) Q A Jo _n󉩏~%s˯S| ]ep-qYvewe3:0DĒdMD@{DfHQDC^OOgX^AA$EM^kƌHs .HR EtLWTE;.C$O_&#qƒz@#5,syh'6Ip@:1p}o\XY-50Es.ssӇY ^wos^ NdLz RWRD|^EUR@c(;c[ (fw=3#&g(9 ,ם6Rt,mi]p6 =[|%u'>&]4b$'T@&ut8c[TW]nqZJ\#G/z}1qʏgaJT󮏌Bczn>Wܖ#?F춷GFɾY,P9_k`8&vi˚L̦lZO]<gNn C?#чgciVns>i8>kGᄰ1{I%$=*SYw<M"R?x`zq'1$~C=p r.dDIZՐ9xS_.ľ:Muk< p&ӯ]xxw<eTA`Q@ Ҧ+pc."qF%EkE"uKe;Ru΀g"$0"@}wT)Sq.tښᮯ}5WckMQj+,Ks{,WymuO vk m*դ(O B%YYBSVceK;ӇT ÏHXPD Z: #V])MiDzG=V~ ?|05g@# 2> P8@T'nw-0%Q6NGo-MOH_$po#,c2Fmq"n]]Vw{KO^趼-n:k]L5R!/Jx[lphЄ4Z.)PȕL0I?ü[1NAE&%[Z[M\)mA'mp"+$nКRqe-Ӈ's"($+Bw4ߞY?>"kJ8]ZI_T! 
tL6E^`jGMȿ_kkH!d Eݍ9ك} VgAa"$P/#V"9Ι48P"jD d.dX<+aRv,!{me#w3ƌU:QU*ԗ?ٍ3ߌJ_MmR>J Zi.Y<^t>CQu)`.*e=Uu\܌2M',o浧17̖]Y>cף)ÍG$?VkWUΟ2VA#jX/57z ߎ'Irt;n}hT {Ujm9/)̾^~γ{J~P&p[ChTF#C{i V &CKPC t89T16:hN!2]*KVsrQ"5 t "14X!HPGEXg!D)oKgb4ј >3sd8$D)ĀXX-Ea>Y&P&%JUKeAmgryN!чHv ʦ_\x0蘆f=l.:fm57UOX t>_tqy{?'9O#9wYwOi8+SԌ== zV=̷xa5~ϓk;}k/ ]jc}NJT*fts;zwmQCy:TFcA8U`|uRK-szj"siZN *Lr QU|I Ө9ͅ%TdKmy`6dWpJ& j %ՉKYPSڳzCc^}B^9Nz"ZMZjad-uFko@FU5Gyd) ཋ*yo-h.Pd8~4XG&ьK4U^F51ogy_<ͨP."tUxTzC;;?].*%ZnAwE\VMҝ ljZA+nAMTPxy/)nBnl̋%ӈnsQh0F.H&D'*)=1޵Fr_A2. ̓g0X`XqaҭiYU}eJRҥ*%eҰ꺉I8d09"4;2Vw 9%Zkq)%cRSG3~dyƶX#c!X˯NؙrO.o_ȫ/ t||bƙlኈ.*S#9%nL'7KVKKH ;6Г=J"mMt>;بhn!6KRt-ه{4sGl9=)k)dP{,ȨfԞ]Ohh+WI?NB3&s4g\T LAdT#U];bl}o\6J"bIs~A5ж3R$:㱆51yCJxh]Q\ n~|~*G~\.%}8B94ЧOߍhK P-}x1Z~hbO︕-;V)*ITÔ&fg=>@l?j9w'@%Z as&\RBX+(ue9PcM2g/pyJ3TJ~Ei~ׅkvN 3io?{wכ: ؇=R1B{fU5n-լ +T d(^Ʈn++7?/ t _@9zp~EGul8PPz{kc߽ʏ/0.`N~Zys:'2:Z 'u zh#_]#Q+~c@ҋ|]CGggUҿV*ѐ^_w(|l0 b*mcj%["nJ&3^qګЪ:wowrY[ܷ+VP(C%rk} ոT*Iɰt?} rP8CԡH` u'^|$Vz>-ب1v y#{TZPޔЭ 8҂ZIsi)-o=~݋գ ҕǙ/\ }k㬌Gp7ycHeC2 \uk:\yWW>\ioછ5WJ;k+4}bWj%:>7Wj%5UHmvydu,/[KDV͎ï~^5:7ys)jډCqhbnC Gz31;k~w|/J–ߍvuzzv?߬a"t"ѱz>!3T -G0 %dlN=c3SD\k;8Yڥ;kfTS_n^^Eٵ (=:Ok,t\ϻ1'D9%A).<,bJjm*)PLT ;)1( %%gkBD&)EY4ȸFqB͊xz[r,s?!5 M6~;Pٚ*(Ќ0 >z [חqMajM{ bKwϏ_*I;UUÏK "H.j`DXBĀx LMjaĜBÏoG=w{ <@8d\HxR1ź<8y@8 U9y:opH''D^mNvJiDD{lgSmqp2;eJiϽ lKV3XK}&z99%pŚң!wxvz|[ɔ pbm_[riћpqBvCqǛ<5v>!"IJKlXj\X_ bުU(&:hSQcd`{j5BAz%LtDjĦely`g 2H =RBQV 9sᚢ1b[bƉԖ H[/EMk ^n\zl`pBb$X H!4}4րx-q1>I{hZVnWK ļM)rRFZ .PSKr U}<.6Ƞ16;K+ JB76^4_`Ѕˁdq1G={7BOaǡU4QC4m1 sM0|+j 5$&_dD IgvL'c-ÑTYP561'xkUzSjlc'\-x>wgc+a蚺c/D1-Ziɰ_Ӌ8S,)l]\h!Z5 >Km;E虀>{$(┻h_ne 4*5 Io OE cKk\Nu'\(|v!ZSw ZR_؜1a_~"mO?}aظ_7[߳/ٷ7B 8(+!P=r..cܵY]EqvkBr)kbJAFãe $bZ2ږȤ!7t5XufβLT̹KUlWi 4w-WkƩK1YJ0iFLrxçzme! 
Մ^gzF.smx.xnx6UPo=92%$Hb+b"&4[xX9DQPiam{FL/s+pQ7/х};?_ )`jqS," )\Q (U`1sd[]}o,nE Wݺbo^*8Fg,yl JokkɥtmFK֯>HHU#)ad ) &R=Y)KA(X2\hiᣥw߽rt.H\ʠ_fU7wum.eROP>'u}+BeZ?8nK~_ܛNޟ~>YkĬCʿwv_kiޠZ^lgBL+љfHIe$ C9bML%kjg~0, J.h$>&2%5y8J*JآBUd5[3'2UebG\{RDh1^h ׭eEķ/GnUP%NԤYMS#wf"xClwe<|fPkΥMĆ"ra#29@͸+d+v:r5X6QWs#6E PIEHGy`-o .g>TbV&'Iz lYˤHTT $[zi1Cӷ+~N+gHްց`c)e5lB,Yމ`J3js4I(utr-+\ʩ c EJ.1T^KBB֖cQT-벧4(>n{em[Ӯ/g] xݮSCiǞm&˘+/ Qyk"IJe !AmMob*7l|J`?x@xH+#(9容 t}ƚTLΒsb]h >&6m㊌ 췩Dhmtü"E_Xs:>?H_2,_2^r RK"O.fi6q IX}nzbV`PC H24fߓN+VVSN560gpMz1\L1q44BcjHn3|+4R^lѷVQq)ԢGt&kGT 5@XB09B%s0Ň"$4cr BfM7rfcdq`9dajܛ\P*+frbq,~U? "9 ;֮,)Y_5)2GmCc3VwUu=$(ĀkKK iXay a9F7^h,5+iV$ _Ȏe kClFe<;=~_Ɍ ݫ]>H[.=381'xG`)XU코?,q NZ*ʃlT,bp}'dkyk+2 h}eC[?d34#dĕ+-f<"02p%_ɋxcr+RKmTzkCr_[ 1.1=c΃l׮H_.N۲n-q0Șzik%9 jOI. -,PTJT9 ON(Hu^iƴ{i=P)md{|6F{P`6}h-[RA*R7p67-WYJUW}*,x^!WR*@p@D0hLj>dw_[z8ˣVX]z0Z[ S)G\R Is栲LV9tӀH>jnIо8oͿ'rsX;9aUq/&tUpy@*i++K=(ýD8 tVIՁPI4j Ng}dFH1xO@O2]7EDo{sC3"j2,w)(.C>U ,%g2'D}I R٬sk'ln2U6 "ЪRH۔93)ώ !ƻCLySbWn*=|{0JFa|LNY֡<%7| ,.pJiٜN!,US>/|.ȣ* +We2™.~BϢhF? %~Kx9n_,xj$0 !|PO~f_}"t%_p…_5U 'VjD߸@z!bCxCEMğ֏8\6{}k}BȆP%q=@],jfoBskJ8oWH zhIjqS[r9R׹3UU5[ntpKc;;L囖Ru7^tJK+K_lUVǩwn_t%r/inpS}1kO]fՒMC+]]5OOs|b혧}4՚[O>vӓÚ;̼3r??,f=mwws'-EV7_}bf:<;w'mm˚ dsYv;U_:z>|LMFZv(~]?i{Ow⿋Ypwt>>тF`ʈJ TU4 Ynnn}9F~n_}3&S)L"&KawbȊK 퇀JbJ4, N!% >XZzzKF֠p WRǔOrqH M0E)WGѳb荜!A<;JEOLTVtxڟNQqWsS^W.b^3ڊ0*q,;%84Mj'e-${)xLvP<=)buӪz4#$oV$2AAed$1uAIe5Zݠyz<8)|~/ps&!$ cBtr;sDo9(Hx^ jIrdg %Qe-ޭ| %|ܦhj#a8޿e"dj|#n5mVE])gXH ӟN{Ŕ&ۑ/ge3pleHߋZYu-&_S- ?E⥮&,&YxmzE1xVk1+gȅ`I*# =H: uBA'mw,0cI {#զG;qJd=ok\_V%T.zZhNT1# wѓ3_6dȱfTTmŴU\- jŷXn.K01 sT rUNmІsChB-HQ kf(bsbs~po/]؟'8<>EnI* MRp93 /oqښ: R*; Ixp0?H(c "jDQC_A:$ Ty)X<4"@ odU(ܡND{?wR ?‚臏e3|cY8x'Zϧs6Y>`lu`o]W&zӥeWT7}Z6K- ӓs׫04o4ƲE؊+XY~sw coom\)P'nK\fX=?h5xF|]M[x_q5}#.0I{k[I,|AF?7?[Yi7=)3<%:o-O[C얏86 OF8tv? )P "4~:w8gaH>g[ dǸddN΍xK 劶j}mjI]>2#s:X|\eےnA5+}.z)}Dbk/|xZ8z뭢Q?leٳ\, wfu/ٽ5qhx%l\㍯_iV~t9+Xmv.Xt2_࿮fT? 
G5(MD[x]oy(w}&36;xW"݃ќT{lZCt4xĖ $r=V>&~7b DB2D Y)+xeu 9 %;ĨP{uu1(f<\z]pkA,=43%U XFEFϘ,[p&E3BB.BdV{X{ӥE?@+C;{R#h:s+q__@ q}Ɋ.A37«p"؉^b>BǞ^j 3íJݵe)8}YZ㝍~Qϵjn 5bpK?F٪i~@%[3ԉ -sϹoxW||jO _,Ź  YZHYUz]эhCq]m:$?6' "3,zxhΒ{5IwB7b^w0eF{iљ=lj//`=WtKm|~-IR_Lmx/`qֱ %$Ԭ%DZ)@iv}Zq,d P!ZB]|6S5LNql$r2+if8EcRǾD('" eEs'13ckM,G+B;rq8Z4\lh޵ja!-`b=>8k.JKA%H|K22yk .?u%r1E=qůFz\A-/JwrWLj+xT +u,WCmCPj Wǃ+%6+hahu0 '7ݡ1 4`ry)^?cw1v'L#GYgwO]tQ5J@ 凋uwKZ汯o@o.ۛ><ͭ+  qow?mk?gn׳+|06 mBo40[5K}EU9 = Dmֳܼv9oaYѕg,Y{ 9Bc S$vȫj_9c8"y^>.mt}k߷k~U &O8֐UYyM.mɴU͉ԅf5s3/̓"'&B0U9!#tjض[rFonEU@RhQ oy'/rbno@{~=T>p{%vxЇFmISv^W/o'jlqWᄫ}*jCKPjt(!Xu1r,WP:JN:B\hN ﱛ\KPz*՝puBaeY { WC.Rp5ԾjTF>qv!,Wr!7,WCj wsJ ;$\ +Cn\ ۃwWC?q&. kXN18b Bm[S>WC'\#c}pw)Vr^[]C=PGf o{uӶxGphNU*g'>=ջU~䃿40Md~ipۃ7QfܯZ*Po񅬺:YlçPT1/?U~c6{-0@/j{͗]̔ = q } HǺ?0tV\~fy\rl kI޻YV_Se9{͞mYb'HU'9o7~F6u~ @]^.j^.%v=قjv,MbrL*2` lΐa ^5Kt9eڼ1ET7$\]u.r*~O;Smhd˝_hXJҺ3NhcA9$V jACܤZjS9Jk5DSYmlb|-QͨQs>ŐYIڱM]\j*z]nSid0Jhi/lQAɘS`͙bUԁ'"$5tnNJcV+Nj*uLC*)V>̀\6'glQ ϐXcgmEk;ɦ)i$MHHI? ҋISj#J$9TCJ!P`އƵ:U`j4&b>Y~ּX|)^TIj 醛W=u/5k0sS f,(2Xg!8dGhO͢RKͶ#oGu& eOhZE?%Vq$rN$`1ؽ:lDka*<玷:6848ˬap(3ѕ0d.ޖN05!1K`8.t-a6@*zE暋R.QG@P(k )lXkٌGaUhWEٻV(IhOA'WʔX]@1)-hPzpuTK Vje䢂l`.(\HqNxjQAQTߌM ]pl 5V:KH&bWH:zb>Xζr3.m%W@q3FN۠@ C;X#,L ,wL3fy]FݚtR,,8³N0t`a." 
b 5RHppg*KBh)7ud&e£Xie*3c( Ls$eIj B,#%l~GBYgwPS(H8#DWQv`J ɶR9'Y 1rǪ',$2%olʻ9&dY8"2LҨ 贉 sAՊhD3&Mԙ#aP˵m/fĥ&3f"c & kd- !0!bŐ@߯ ;OL*nnj\^ׂ[W=ׂY/@.jmFP 3 o f1P.Ms>o:@GqV2$]JUe ,#(y9AAEE ̄՜`$\Pd"UBu x5tqZ, h^= ŋ& Kt_v Um^n}[`q;28J[_NfuS YߙkyL#'¶VAb;>-Elm7|6y4h.V{*ce1)Vi,!zܥcn{1k"!uj Kb֎F}u ,,hLEQ6bpΚ%B-QRBEK0ٗda<-Lo@WX!-HpSbi6aV@23fmqtbl]NX=&_v!gѝmxdU#ygTf% m{@iRmުi^"If-w,a5>ZhB{ŧyvd fƅP[ #AtH 3?V7rsj|L{svyѶQ0p]=]asmU烀`0`\ &IQGc);w5)EIZZqZUU_Z`8FL-MJSpRhfQxk]YCm58EMyhjh9 7zSNg_~4G _և N-`BrH9]nxxhzD[9SKhLhzB #RT"VZ# R҃aF7lגaݢIQ!C|^wW؊vE1\ƒr<Hb&r 4A;yM ;bE 9RŢ4򨖌5p ճRuHc-q=B&mX)+gɈSZ]FύLWjmR#Ճ*\#|r\6fR,J@LTH#jJҸh 'k}P*tBe_j;J|U `i# 7APX?ܖ|F:dFjC^ԀD26.M0""9>dT=H=CB&a٦\3JR [1/LZ6ِCL*Vw:L.Z,¬"D.XrF.oIhklB:KQmtE Sq1yAA&B2빸qXs2D*uξS+;VD24EH ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H "^ z{[6մu?NL;y@,B! v~4eq5pY䱃KJc\z gEƏ(\`'hU3W۱fSWs,_dR )ȶw?6NHm\RMUژ0 B^PO$ kQmrWJwQGU,ǔb;kP)?"nK[_Y' t>52ԜRQ%9}ڒ'<-S[.& ԵU#ZFg3fNZw=*A] Mׂ}`W矿' ;֙; _PuaFg]wff 2Ő~qn1k5/W5]Y<#\/t9׫T6%NI V%yXǁvRubmA}2 lWoz'y[:+ryw@"펇+wD}]\ۿo(^\>/ʸh#uc7eŴ[$핪R腐^qoSqeղ#^+jRxu`hϷŵ1;oze:eW]/zwuAP-6?-׾F?®jQkn~l|MsXǟf_.ߕnޭc>}#zXOC]krwۼ'0dYMTBJW_']mU,/qXNߗ=k/˯{Z{ g~l}~;G'}s]ʯWM\/8l?Z_ĐR~<")2:]#fcu5P*\ %q׉EېĢѰ,]y'N[:eQz75˄-ш4N<9;Vum6ڨwh:}s׏wN7ZhSdQN7oovz3gՇ->SajB9_CU+9f\Jm΄3孷䭇>>0m9ݣl䌳]gzŹ"ezt1h#CBŵ,XQmzhAtb?0+Ps]5|- }keo.)s2PKJʸ̶6Yѷ-t܅JI"-A\ӊyqލz|wN6oIvOv ؋qnU1o lm0Pmv~ TRl՗8[1]M}1Yb.ޱٸu=ia:\{TZW+ഷ|+_g+{=w",< LT3_Xφ,sR:F>ŽYrn:WY[s gqikt>]ms, rYX&Ey'+ܚ_M&7G |,f_ѯ|a[;yd-ooOzIɟei7n?ՙ_f]mxں wᗍ-%ϔz!\iA8s -K2h(cLTg5v2–;&c>TE~qOUdXvD h⏃){X#ϣ2+K\ʢJ𼔍*B%&GSE g6* ڞH5W,>=c^+ۨUTX|*3>C;w:оLk 4CCqqOO;\ڨ֞\~#ZnBp6}SgУ5=dtNeT̖/*L)nC:jd5fj乃yTσNd͊QkleZ3TjLH{Y%հJY浗Z&mö -"X&Ip{Wu"Yr`?̜YW}nWś** _z xk&Zͽ %N^DRXЉhXehtKg QaVâ2D}.!dW "Ek)z)n+LjՌ$ʓ|t>]|`.DXS1f$,/Fsf**JM>#t U 8/Y(p 8euyȰw(ѼoR'n  by?*gKCi+0٩dAJ,s 1g4)<>q*#qۆmkpQcqAt>}z{,/Z#qhT#k#TS)XV8:"QMPm*pc]Z<|`/rf'q:F8O6\9V{;\w6J/$ 4d&M+DJ+s}RFŮ:\fc1pY+2}O:E㙊r-/b* /%&ʑ sdJT(uv:%dI(nJ0i{,Ef4˹dXì̋&[!Ut50DjjI ɿa_#:]`]Uc8g9C31wWǜ{9ɑ<o_.jqqvQI~C٠!^jÿ{F |ɍ8v|r NXZ-EJ{r!EJjD %nCfuuOUuUu=)P 
Ft??|UfhB}R"(jH +$$ᩆ_Sk,yZ/I-íXur 鎊xȾvEr^vښ6[A;z#HFoPs"X.>$D)ĀXX-Ea>Y&Q%l[ COGgCfWVk(T7R$#By=g!LxzCwԣ}BM=aqգ5ϨGkz{zfjV_?*]*B -.guу 6hTBb /EP2`ڴ /KϒЁZZ!tR+eBPg)po<"h%TIn<,"C{b3Lz\dz~,imFi~L2Պiٻwu=ǧ?gK:\Zx2F(E@*jdmΘ8¿{EO0 K՞"@pF(#Jܹ@ 2Z4x^:T8Q9*uU'ZI뷽FJ&fZ>s*%ՉFI]rgWYa Q59QE>3rgbD[5d)c,9}<Ĕv 1ErCWL!t\I|uO-x>w=gc;֤ZjhM ݘWmkE9␙|Gx[rt~M?i{O%=m:{w.AYE41ЬL|w* HTGKtƎƎֳYWs Whu{пJ50e$ D-qYlbH2Y''\@rڹ~M3K"`q K0n HZʱh-S*ꤐ]N`wfl)ncʹEpcp>Bm )${T)ZSEFD|-*7]iqZ/_L\#8\X{wez:WkwlLjQhL'Gr`4q{x`UHucb&ɤlʦvZx6f03}x6;fV'g_>x볝}NI$u )Y,~Ri'"y ޛDFH/,bHzJh4y8QAH'eZpD˃%[Gtyrr>zC(JX^Ϗ!aoWCBφkx{"Q/ojASC4% Dax$VTg9CC_k49iv,5 ĥ\b=<ךVKn@F 2M88\ ʹI$iuTC!2v:4'[C΅R ݳ[jl=nSE|#˒'/rWF[s}Px[ShT!&G!Z*nGʢg'~=+k/\MޙޏZ< {~D"ͅ r`!L58*NiN;=ꩵkIQ?\h"0>ŎrӠW-e Cg =@1%00"PIOx\{K F% R &駮(7h?(DU)V$oW10i-y BEI*H T#GC@q9@ Q \~ P<Ʒg.OHڡzVjrM(XQ/oa::1R"YCQwcFpΣ@h_A¿UYP<(#&~`~iy~'qΤ1āQ#"(7T(en$s)%e+I9ʿmz˜J: WJ|ev㌧e&(sRjV;rVo]?/K塨Z0Uj;:eۑg -浧1wj̖]I}pǖ 765I~L/?YeЭr հ C5D=&%K45ٗ7$XJoRoyIyt`nrSv$ZMp6c7-|=儨)?S8> N}"ɞ2C')dnPNɜ\h9ĕ%\Ic&h5 -b! w@R <֮э!OYNtYU񗙋FL:*{mdvRrRfZ0!O?n.j$kxk䗻^<鲃7TzOpM+7GͭkT3KT?2Eq:J9]W/62lP6/$ޒ! 7k9hZi\G=MdGVr, ƨǰ$^PV 6VNQq?4:\d%X$kQ&QVM;˘ *@世QF{oߤ<<`}rpo9[b0xɕ?>Fz7؋/=h~1_`(oSDBvQ?P{Ž*F]E:01)BNJ brs $4%!:oEg5E0\>R435`Uq9^FQN8e%#,O=+ӥ|%A1+ziw{b.̹[q_demRZA_}+}Uv6>TgYiL_fb]ˮw‡x<\O:Vdlq288!w>W[^еERz] }D(KQU=LKϪSx >+Y~Q_)~O? 
@1mn+ck >smvuӌ*ϽJ64n]gxg폕iT?[\_ V/jt4w׋iY 3`~1]ב] l._F;=!5ng׋5bԨ+z-߼ۣ6]]MjyvFcA@c[T ³49=6cQB8?D$GǨ*mGOF32;9>IEԖfX Wc[8@K>e^Pcp[*PHV'S.yg+') 7;Eѯkc Z^%0.G@Fյ5Gyd) ཋ*odͅ((M  K@LAsmX_!R}#`$E,qpWKk);NUC,8mp8t3U}A` ڙ;jn7E /(LvNv_&;_zSu;@ɛ&s |% ^6jDʼy-I;#NjW.v3%.)xXÄt# xhXAG`Xx]6+ɕN)2DdLhyxhId4T -vRHp+JSs.P4E =lE`aat39S+qoXUާHЙv ~ ~L\?8J1o6)FZ  A<#s ϴC{ v|mPG=]MdzrھWa<4,7ٯy\]j\Oj.ʽN[iAŧ@~g}U-~puNWmDv3c-=vv Aܔ3ZD*r) KI:M]IRysy z}N]3%vpap}s)WO9Q6:cO<YQ66:H6&zFYf獧zO N9&44%"iRH„`Դ*w:*Yqgih=p<&JZxD$xI-[Y\WknxW![,eEc_1-^ĜcV8C,q9);%Ol]&Hz=bEǴqx;!d4\(i,2Pv)'/B2/:A݃ApS*K+Kr%.=.r{wڊp2WWr5qGc+Q HZw\J\ aԔˍ}U\Zc+R`J[Ϊ V+ \+Rigqe,+ jAR "zHrwtc:ssTm2ܴeoǣo7fBYe N#J7h?bXiNr<c(ۻۢvm].&mAYGs?Maz;)vr,?X1GiiSJ7>O 3v:/P84wjIR4Dhr} /oɉEaUl =Vtq,BZKGsߝ_3W=`ũf;M^=*ڦ&>l JgAV%Amz>P0ӧ'ۺ25Ci 2\S"lTuRˏ|skV H)CT%E`Tjcyglu%d!`͢A:5(h«xV4@m=bP.jFBI-~Tav }~GY _6O62^,N/fm؁_w8K`T5%\Rt1*50`=1skJvrvlV~pJݳq܀];mW(J] Hҵkv*pu fM֕@\}N}>q؛ T+lW( WVT3qÌVa+"JWr3+"ZՂ+RkyqE*pu2\K+ v\\\ZP}!r'LMgMEP ս$\"@[zsV5FB93s *ܸ{w>N[^`YW&ׂiR+{ތT°4QU/1>K{%`o\m)W R힖nkNjy]K&9W$ԃ+kU-ⴃNWR :D\ )l'X2V H.ƺ"`+R}JJW(Xz+ \Z{\ʞM1 z\0lE"NW+˅Wֱ UW r"\) {]3r T Z+Rٳ^W5YW$بjpr [WĀĕQ%Rw+\PQXW֪ UZ\ ,X8eHc鎹¡*[MS1`B=3VqwLJ;,8HL[i66jת䋮2{*VNjpErU5"VW\k䀫ĕN*YP0zpErWvO!ķTi؀ĕ4Bh[PzpEr5W֚ UsW`ʺBVv+ky-¼"}#1Ep5U+ɵ :vR)̀ĕ\KW(q] Hn=cWV~Tj;qEIrQ\\w\J7 ",XYwrcIoU'\@ݢVMqv%ܴ]C]/enQS^哀$EոRfl 'F_&ם@<^.(3BjemaR`x9ayM$31ru'WR ׃+Ȫi/}ɵ{R l?T{6+pks瀋pEW(W1V H=HWg$)$ l5"ւ+RDqEy&puDNjpEru5 "\"7LfoڅZϳ4ѹS[Gxdq䃿8'4?l40mj4oϣ., #*P@9hXpP8P?l2'ov5>M㏳__eꉫ"8,^?"FiYw~f9gNb93ᕔQW3!W* =Eᚩ]I^JCl3 n'Cc3 CwrgWB|jbʧͅo7No)㗰dşb],a[Rt۱F*;OUl#gnƂnȿII[!șR3_NwS5%90mcHI;OdnxӅ]kO.Om_ZwBH@P;K0Ì@O94 AZa+p:R\H| \(@E"GZ F&K6bgr BZ\㝬S|ﶇڏ~iܕ,o 9܉c1_Q˳69+8姳ۓo4{OlKS111fc}W&=i|⧹1εFlgZɑ"̗m !{ |mʒNq)ZlĽgHVU*Vvi% TCDج~Y`!nLjr' 8h+85h4!2e\6{pPa&KDUauw9}~iZ6Kmv[lWC8CaH0%o'-LIV꾇eg*E-eK=a\m85>ׅƥ ;_ T%D%WܥLLv0ڻNңNRǢ/f&Y- mB& T4E@ Q 6L xXQs9W h-|˗' pփ˘f ,ET /Ҝ ls9C{dvkg{e5:)U/{j>˻]hys^M]׉Mc_o-ނ)n;;%M g3!WZmIptQ $w$xK "iHQD$`I)4x'Ќ䏝t g#.۴Z24:AG[#qFbq= 7f#qrvU_^EI‡mA<|Ƃ"XOd0kH$&h>)Vjf8G6AIx-l/QCNZX:]5ɥOwMI4"mK[cj #6 
Cl+IxQq5>Xlq _W~<+?H"ThBe%0eQ:0Cְ/{i:&7|aY/ie1J59n %u4l1WFEL1Vz76(sePጭpqr 0Z{e)h'"b4mZ wmK^Mg-&x;MhCy5>ش<鸸lv{dR8mkHqܝ^iǛ퀴^ iYI?B]&AYzuie"!muSWPWN2 Hdldm]9Qcz̐rVB.eVJD{E锨G#,Jrk)X zq_j/M־'@i켟>^2fv]~z!vb lpplZSG ow?#OfZM3ֲ8:*LF<~_v-l]vnձ7}߰Sӿe:AG|EIs-9xkw\ZQvV9kD-0!?Fh7G'P~4BXڇ}5eƹ!g Y MY`g gw "Jd*Y[9 Uq🛏j{k%e@Vww1,Za9gؼ"עl` L3 _@L(*FŦԿ1jeDoяS;:'XXIAaS0"?pr[|٦۶3ԫfˍnn7Yle]ߩ]nRGy ENn.;w{͠M'GW;Vغsjp8Nm'ͺMyíݻ=4zk{=lvÇwc󨐻cߩWt^~ϝQH߲f]S'Eݵy魈%tz3?3%7tL&X7sL&WrLVL-3-3S"K]\o)W ^z,JWY%FT[ tLz~L|Γ\,89wF 9(,b\.Qt861B$Kij. 9P(g͠΢2/~gCb. qA(IdB*q7) Q A J.*bwЗQ 9>AuU%3,lq|Ҽ[3߆b_ 0DĒd"!ȉ$11%2:D3(BgX]AA|m"ך1#R R%H(UQ2 mdSDd$Ճ1Hq\xR(@`@|c񞥔x"E#ډa@i\N R_Lз}h"]G 1J[vcMDbw" J< phuh0ٯ\$A&>zF{eC+@G,(g9 ?&C!sw"@SȀ:={,Ly?코'0,fs:$G`UW47X^&VŗĂ"u]LK~u?]iY%F?sG^lzfq]5G1}t=?^?׃W.PPP,爧vxY]!:#u^S7) |a_dHvX]I/Z:`jtd IkrԺ13br:suunf[RtMX&g++rڶ.);KbmRf=ynEnr;tJy|.<$x<3My&f$4fn둼 y/Qo6 LRkNz?a=^٤Irf}ɼSօcDg"Jx ȩ^f6[i_]x VsN #DYBSSYw<M"Ra=x78=Ƒ__4N#߀Fs*zbQM8U+AHuiy%3 @lTb׈)J+ɜ?EdYl.AX r6OҹBQ 嘍ԯW'_48H9N{6]5;Tuo3mTj0JUq /; ͻxWO;Oc<-s놦Cf m:|SJo /y:YN+Ŷrvmſ9E?Nnn٭/})#J49囆[_pv? ` ;<{J~䈣c_8 !NъTD wJhH9GhBRH<Bޱo`'_a~ d=A,Ȣ[](D+vAK{c^&qPcRa7:G C}$hEF" /,Kϗ! ~~viar OgI]~`%YϔH Y]q\NSFA Ad"X QsN yov@̃]E:K-M(J]hQPrJ#9! 
peP"1X֍YOR| `b縋Z01EpWluVPA~~y֝ȷ<˵Yd'?$}a%FHߌӺ'?zkF4K>98tE8v筲Ίȯ['{Cg[19 ȊOUu:!|oŴkk ՌIqYutu7_ml`ߌN7?|7sM̝9Iι׸.zq 56wec͍?cgJИHt;Olm:6Z7V*-uoaa9Wt8ohε iyYGm<~luѯkwpIt/mE2E Sd}<|:gr] uݴڼn"msٞ>NԜç ^߹6]W2"byAPKBm1oKD| P0Xw3ݟֆNA)AI;w]C%2$Z"oX@S:P:q܅8=(TT@ Ap4h!CNN*0*3#B2^u@hcT218A :gDB0:QY-"?}Zwv<>MNHPIZOZT؜e: ]uQ[KIElP)gu{FAa9;+  gI-GI4dpZjug;qAXǭCsξ;Sdߺr]}bn8덢ue6 fvG+>ٵ--9iolOzdRy9IQ"{MkdN$ $N)I@F#LHW t !9G@ZTe(z2p, %au -%K6-ugdUj,e8ƒEئ7_Rwܞdr7񳸓`tO ~*o(u2rΕAI!pCcif.Y.T!nme< l!V"9Y\iSf7h0 QDZwKl7%hv-WjR[ >FBTڈU,!Jrr8A;G\ 1pIēD|t[-K\<4#+<ӆhFY4DyDQU"Diq΅IMRr KYθ [1;AK+Q-୰ĵ:֒'hyvў}Sgg,=^K-Qr*i  Y4}"*!~0\KjS哂RR9#$˙N%3⩣2$diǴ҄'V`lSRmLFqH e^O >"7<KدW'<[8_y`hN_]AZn9XъcAA@+qH ֥#"3CfemȳգK$ˋY$l&٪YU;#.|*x(<x,B+iQ߾BP^@"( X z/e_tA"UӬy_m *=xۡKSC/=t?Uv.#ŭ >NB`%,$RTuEi }/!'M K]Ӛ9Y 8ggum)H-& ߪ9[ZgN}V\?|@,f.崋Fྱr3n%ܪ Y7L@;^ n0N/(!=ʳ{n ;V(ٚ'wnR.XuPѲ`)~3xŽI]}e0N{zQ7|()^h1,u/dY?o7@\rwiըuҌQ&s3![֝U|7|s6sf j|?7.ﵝF۹v^vK d ( ) + 6vJrEY X*Z<|*D^;K =ή;MK2ZSj %S \-%P5S J'«I- ^{tu#9ʄnG%i+ҽu3RWLlU%WsQWZ*K<۫oG])]U|J 8JR􉪫JuՕ$9+&X@+uUɅQWc家Ҫ^]} RWL2+&=uU=}uU$ѫoQ]8]mWmlƏ'-˖7Ӣ|fO%Fb-?*&R0Y%Pj4" 9/-0F.[=y.U.R9efm _~Supz,A&|FXdΙN"gQsu2ܽ] |/\/66{o߽Ta5p7?[&D~mU8~潅<[@:O S.(#)ձ c̮@؇[>8ކ_Tj]hYSomJOWyO?w)mF @L>FV|P~\l"fQ||︙ M_:cש'hJbCQ&+kz0;h8!0vq``6$C H6-ZrîBTpjMuRǜ-݂lB 4Χ;֌>}m\O?/jfeܷ@Y%P+4@# Gw*yڤRpINA+Npo?ݜxb kwgy^O=ǽ}q&-qH;Cݴ]gk>n[i3=Fk΀t ˲ k礌Br$( e5S.!ga`"+\i+sː>g7!@}Nyp=6 -_W).[->~7~ W4zү*[諅7w_KtߞA]20:hk:mL5A]Cc$%4ڗf箟.~h?HtԙrY֓ ΐ!E PDȒ̎P`>^u;@fI_Khce *IiBoz8bW<::rժQl_ S"k4iQ'4{k/~P5[ =$6@ʩPQ^Gt ZC\I WRsTlB_B%3m5 Bbo ۂ"3X1G+I-BFheҋeGؐY:A+TElͭ%M&`w9[gLZxYgl(g\ l^QSYJ.M( ,$ goIy֤֋Od$SVqEKZ7Gh*eDЉ`O)`>J^`IyH˱Rjuj>NOaj90XtI:-l\s)6$ AQDJ:8T|]{щ8¤xb$:¤;yx3mТʔ2A ך֤S!%6ցį8(H)r^^ںˈ0mƙ፞u"; "oxI;ʨ!Rd5AH9&13;UN3hffҋGwq5qtT3U:d!$I5__KfvYT#QO }ɑMg;/6ik&Ū/*Eq @h|QF/'B_rmNe'ZRsH)hX c$&1iK49YBɗ"C[Ԅ)V`Ł".h٤,,ɒH$?튜 (k+A)eF6+CAb+#E ,2A0LdzvF57ƷANSob[sGJaE;#c&ؙd:y.XxN~~DnEc4P  KA-]'Xks ]؄| 8VKCNj`3֦G῔!%#3@b0FȘE52PHQ 8(&Mqdr*JH:aq.Crݦ-ͦP?󸇙q샹#sN9GO,@v|s݅D"kUZ,BS>:pԦft|:XH#=zhND/g”.q\*#vRꘌܝjÍG4-8 %Rs>| &K( DAul 
*DPTgM%峩3]chq°OIf(A,ӥHҠ&rVY >WqnI9L2&[MC ~T"&pL'гAk#;umF( ENU-?1gE9႓s uHч[Y"R"[${җI{`*Y窝_v%uPC*j]"wHa )^H*֯%7,1evCǒoMwB ~ëp5H M6ϣ+6:gҖ8 T-T[/qmOا+e4t}㲎M"Yֽy u2{Q:V?ؙ Ie0{-珣`?{cUS ;s>~=v?./(}εjz}ieU3YwVWi-ؿ3}WեFK2q5%?f&\gʢKSx[-{6u~U{fbOcWkS.vU֎AAz r*,]nx;ڦWr *Xo{gnݮQh2O?p>Tpƭ?v}{;Aka<yŚ7yVț-g<5tY/YZ;֜/~q.y Ax|'یt:53mGʴ!mٷԂ|}6MG\ΥRKԛJ7}MG `~}l֤\D`JV50RhD (r(:&P'~LNy>e!sY;X%+SMBQ[4 /py7ꊚj[/նKϮv))Yb!9P 3cQz'sT: \*/ތgCbx͊Aj_2R#HJ)Zz}.yAtfES -mbJ`sI&%X99EZB zr !U }T+٠3=k5sn {W5Uw~`͢Wg0)/xRD !k+$sbxU(\Πј$(gKzӑoߧ^T=ymؓV)%hJIٖ,Yc F?`)MvPy:<]ޣ>1Hq\YsMf2zy*:$]#CΩ,emc#+}H}w8/xM>ea=m}g%M5mڦ]3_U!8b(s͢o}ha?J #@Qiͮ'pᑸ;R !p.se Hs kMNȲ" ңHPp1W5f!sw' S g)d@\esP˄ZI?4]WQ: V}5:hr yra6f&Χ%pqxiӢ 7֙ZrQ?ѳChU|%;7ԏ`;WW0tz=*(cS;!|Rr>Ď}dgͳ]l!]7-~Gv$MG+Žfqm䆤]Cuwf6msoj띚ݺ67-nmoZ,'+׵+~H$e6oXd#9X/=#dWBlF7k7~ :_۟F?qs=o8g0~l!ezctc[ ,p KF+L*䢹O/^]|GSq7۩DOP]Nq.cpzBM.AƣaėwWVӷqH.?f0Y,]$]> lnw=B)9cT3tA{KKwE)6ZȚ̵K ǜo~y轾ɿ4Q]X=^Px $UjH po瓌7bt]e7 8nY6~x c2YmmwI5e%Xg:o}^{'O C27\߼6>/op8猪4WӭQR,R+ˁrF|<оˏjS'oXχihӔ_Ln$tVGV?]8Jo`M݃&#_?ow^l ǮwaHӖMlu#M|0؀3%'Fn .YU asatpHe :zOOɂgBJ{=$"ud/r:=2gR9(㮯m4\qW*zbQMr.dDIZ)AǤ?.6Ѫ|WF~k7Bw1ǢLG >XF4 Ep I oZ">"2gd\RV4!.X*7YRep 8sg6-6kkvr;J"SbC^wch@0M=i(Bu9h7-62fҘkŖMzYȟ`^Mn>ͫ|˒MU.jwM\cX/[e~.ZWj>fl}YՄUH0Rgo. @Ϫ_K` <{y|Gϗo1-:U,׽hJ'P#;AQkr9+x$ٲ7-3jPf7lPN٘]i9Mpgnsg0uwo(ԈQ.K6\]oq c,'`D0"],'#)2r g><=vV<,m`y}:7꫱3D5Ȓ鷿5e}_k:eohpuo]k7FE&@faFG6Gpu^8{py/@aOX<++h;qGhE*G@}n]4F%܆%JER` }lhvfU#w*XJV◕*I!0ruRɘ  Klr\˓^OҞh6ؠ`yKgO58~()`Јleu{ {͛,_a6ʛQ~=G]J?~;8 u`NcuE1Rĝb. 
$ԋ4%!:_6?b f#RD DjFJ3.>1 d0X,F.eFYdr{ ͸/d)O6Fl ?Y0 hbNxUN:@|_-zr&[Q[}ZO@Tvy^1N<ub{B_'z+?K~pAIz!u 1OF\[o"\j,^jvL,j %ՉKYJ^OFфUdtaYiD>LZU&gqE2 nԠ(,%!wQ|\ză ڲV@VBa @x\$P>y$qm6|` )-ؤN1&7e(z d#X,) !*ԔRBd,F㒱V)Wg+ eaY(NI˗g7dko6{jMoK#~j7no*3R%pΕED2@}3k*J!lRnx8 YCD,6A f툱 ª Ma]% ㊊y4Rw+UaNR`qه 6bPI5SI^`x_T& H*PCFdHjсFYB۝8$ڕNTa/Zgx,+uaO$KgI+kD4#+ӖhFX р(Tc(!J ԍArIiʑ@#@ 5i&4w|(]%bSyEcu}-,I.VUT-RNr 4[pÔ$e-0Z :E 'xx4w)9)+4 ?׆6{NYs ڸQi@w;y?~"G!-t1) ,s72& eTjf #γ$U ^\׹>G/7+GD98w 9[)%rx^&sA$pX%uQq.]am$)N$J")!d`$\ƘVTvX1rz}+m}׎.&_i A7D'M X I?L5LNzknEP{APz'pj)^@\n*e§G!7\iuӟf7Cz~)U\ly,9-ygP'Xъ\wFŧX o\vPnxc^:^P9c>7^=#(P2xLy&erít^]AEo3$9;9>i#nbhO*'SaK&d<10<.f5??kk5G?{W?EΥIS"W9Fl>+?ۧQL*>Jcnẘ 9R{<:k6{,u5`YϾ+^h +^{ߟǝ$Bnv훟m7, /{K\6ȆNݽWz^j>1^wUsL`1QU{'i+|{A]~y[S?HY5rF-$PQ/Қ'ЁFYT&|`Sb'˜ WAF2BTh$$u>i.*- KC$Ec#dHEKCiy{<1ֺZ7_^$`H B7{ȱ ̪/՗`$8A^$8_6TK$ɽ؁{DI$EI#ks9=Ꙫꯪ뢅K"(Bb!e2*ds}rHuIYK+ k'i(QFU*6F*!ꏋ6 EŇ$.H|L3-]AT+JD,ޕ *ʡ,]h:hT>Y],rv,im%0FKٳTMVRXА2v@26T3{ ݳ/u b3?x u<-CTɼ@f/QDc)El(5leFFdUjz2Dsjrjʃ(  1VAK%2pr>KNM-X)h`0NYC74!?ox\ k=]LLl}6!@L/M95kuݼh&WaDβL|}}}o kD]s6e W?>߼σ9 R"D--R@Ԋa*E Qɐ jUEп19]YSx:\Ox>YRۛ{%?u>}?JdJ"ZVd=zmƹDN2nK։Y['UV ZrGEd %aA&(]('JRVdg<,רkyrhgO*k =Ӈѧ͓fUnsvԢM[ZNo,ۮod*_vRN6~|.թ`/ofUJ"4[SGVEzī鑂96=,ڙ)ک%Q"ՖIDQLTd\!`Kx=ױNP@>5C̵ڬ  oxmd7S3ʼ )B(\)FK>@cL 2Eo@,{Dxky:Yl1ևM+nB7޾y#X+=FupMˈkm׺weWQ0S~`:/ff9~tsl&T*x'YY"zn+#JgG_UX+z~a &5+&hs4ꪒkԱJ󇮮*ثG]_ٖ}su(|ku8rV]=ZF]=Ja+x^]=K'mQW\Ǣ*/*{u +,xU%AݩJ*(zu +cz ͇hc!1kRiƉ ޜzR>}-T{7 Ic$ᔲ"N ER\P):1j"tl+P3WIb8x]1TwNK*:,=Y /Ed(J̪M㒃+Nnj'Ta8 q2f56G1<&W\__:~M["$ cP6{H.Z3n:smyk3P 7>K[w㔚Z_P3y1`] |xĬ=ѐ0 dJ^p1l*2Z:5f6a)xt'`I"qSCv,[t~Ս58m4f9IqHB=*Ϯ<Ȫ<6)֢ 6+e붖'w*G%E @$EC9S @G錾G"3ſ}< cq)AWXԒ0c% u4A uLŋthu!]EnK)yjCT.C}7OŃԬ :xJ-VZt| ^E(9Uz"؉˲3x~mQmA7Ͱ7z#Jb0þt0ӼQubrE  h kPrR}yv+oSwn:.dnAÏv[FF:5*7OֵzZO.^$ fI ֜O)"f+|9A*f4} ֶmPʐmݗݤWc?4|A3Zn0}7?V wm~n~]ƟG[0M q7m7a8em[|a5IOo 5xoԝ_hoUH덐W7@Z˳%S&Ķ kE-r@W*]_Zd k DZvt#JXޠ-:$O&#c:h=-q.1KBhD" [3YS>zD91k} ](e(1hp:팜=q,QJnU_պ$~'"CΑAxA*A% V*J%ES!hqǏ< yөYY9iZmPճ2+)'us:ץ~qr> 0$5MQ7 MhQk-Wp GWkpp7na=d"c%ߗ-ʍOM;[ (+Y*Nc(q~g {mfzHrP/_0SKۓ-9£aM 
[Binary data omitted: gzip-compressed contents of `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive — not recoverable as text.]
qoFpi?w"7ӝKj55[UC;1d+z/~߿]H&ށQ;A7\$@ɠT+P9Q w# mٴo&Έ(X%ZQJ-(`LDn;J8.\4Ea/z^J(AYA(0_jQڋQ-)WElRKqv|0Vm1Uռ4j::y8{謸k!ytu^ʠψ6}8y+$lO$KuϜZ9.&}|quat^Aً^zcX}l~R5WzbN&~ivG>'@Kԝk4^|ˇtmtnTw} kjLբ}W#!+ty5>:ﳷ~ ZYqKY; ~G]ż:mF#wf^pBV"*oO2Ɗe͚@m_k"?d<ҧ]h;H#KtLjN}ǡxv1PѤCxcCa5+ڌi`v|OVdž`V>L;7Q[d Zt<46P@ƳB ` oۏ5lΖ: ]7aIa'lD)y2]V\#g>CR,ha9 ?گgD{c>%WNĈ.BsuQK`.N-se+ms!UF((@C*+\=r "1-4͛-7JT1oM+^V"]jy׊Q1G8)LGAV>\}9ɪg /*# jU1uU+A" K`a`7uȾ L@FLcCcqXd3rZDO=Z3Xk/;V$u]wurСM*E%*:UۤhP+j!T8~ST>Ʌ1ܬʜ8vH yp Nc(lHA)rڴnh:mehyBVCwږZN)cP+rWd5\1x:+Wދȃw_.,- 0ϕ;>;:M?4}`,A2Vm%$Z]g")$sfN'DdZɃwGh<Bj0)ѯ RRTKpeSpAm݈{d.޽è ֚A0H<f¹iW|js#'1؅ɛad9f+kE@VG pZ ޲ c! `r#NԖDh"V!@>Ap" oJ*;6P"۝ƻ w0wVB{4J9R)JXPu~T;RWAaƵtßs@$BRsgA-ܺ65ͺ ӳoTtu_(Yw?2^ (֍juŷV\?7)#Z易0}-r${Vpo0q T%gPHWK .ڲj3aVc()yꢘ=J27ڲQN-)7cE,3gpݸs;fKciDnk[;G7owgAʹ"'Jg.6hJN+݋l&N7f¾@k#vM!mY8ܷ]@MƑ%Ҵr`B!('sV@`1 fy- #t\ #lBHGnv^~-vw_P@Nq|ԔFFL0X:RYVh/(+3b9V&P)V_Qzkǘ&t>o_<ӛ80lipj9>V{|\%֜"dsR|HH:yx.A;.<`zsS _r7ѣܝː) j3XKɱi.uAK 'Q !вITT,2Bna.#)@uU;:@hsG _Yڍn؍dnZ.6R֍^8i& QɌ\Uс$r۠ɍ\خS#wk RC|'5X6,N?/i@tY9+/D#8QCGD%80"\zg_*Z ]#>Q{"T}VĜ1G0i NX('ROtfS(CPZ䇦pyAoiǣO{C !vխx griPrAI|!wTR6}E97AͣycǤWw4S1<ᜯ_VjqqaaaaS4h\qTrO7*9@^B`Q2QQ$PowU͂cqe89O;>7S&qz5w 4gAi6祋%G×J3D%wsStOx)K%.!ks'|eeoJ@arAخnƞd8s'^j^oEʬu=]?rm_WWǒIrtޣ<|>Weohɢ\ J#(HltLA wylPrPĨ5\OX*pl)L9j@6uHR%+NhVim$U\5.ĞZi^F%kD=K"r+lrʅiB#d* 2QjW&qB wl󀱭,ށ-1C"4[CM `h$-z5LLIE F ' la10U%(A5x7TÍM6\nyj~w"j?-py5}s~*-i:ǚTv:uA-64ls8j5`jA-&S͸ ð9gQ- ^.09C~R-phT){q6ADRڨښwf\2yX`f1zs؂ctxoZkǭc c5A#w5C}uV݇UMYL rZx6=ЏZA+!' '-~Sp+,pNddװk6աg˫yR=< -&]K krRrO.z2*JTJb|R.Cϧr}lVܕ,A|-- |4C av ɀ vSBULըᦓ$zըm'Eb3qJev?]n{"%P~! BTSt3^w2 ViNTےT ]ĵ#ճA[X,o'm\=[.ltpN~8Ys6 ;˷4pxpA%垡7[.j4]_@E~^{ ;<+Vڳ!qVQg#-i W$dJ}d(1XT BT'u62#ں nmh WD2盡}H!(<I.ۮv%ꡤ!l;T?5%q}p늹&v!u_4͜NL&gǪalSYe|eqY\r&1 ||A`Qߊ3 a&g+'cO6,ӽK'%aT$Ezj+ddw<|{W'}} o@'1viruJxƤeI3#%pyE.}E"T5'˽9b0vOYFE $%9)E<Sr9 +Gi_HN3r33 a8HF ?LVdPr7D5Y*ҋ=xKw„: ܻ4π츪jrϿ{r;#Lvsѷ[ӳm.q /9&r~Vetzl>Yd7-,0_G׫$+_3ƐBhv8U`¦,|)]L%d anQO\?ҥhd5?H#$|/gdj>DehFΙnr ?n$8D1r冋L+J;ϕKe48W y*SZ>.wl`1XT BT'u6`v:dFC[UN JS#HߙWv"'FjF3M: ( qzJ͍!V}ݓ'V . 
12079ms (19:42:18.544) Feb 16 19:42:18 crc kubenswrapper[4675]: Trace[396780574]: [12.079511159s] [12.079511159s] END Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.544165 4675 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.544129 4675 trace.go:236] Trace[1806945720]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 19:42:05.839) (total time: 12704ms): Feb 16 19:42:18 crc kubenswrapper[4675]: Trace[1806945720]:
---"Objects listed" error: 12704ms (19:42:18.544) Feb 16 19:42:18 crc kubenswrapper[4675]: Trace[1806945720]: [12.704085514s] [12.704085514s] END Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.544241 4675 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.544973 4675 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.546966 4675 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.547631 4675 trace.go:236] Trace[532417805]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 19:42:05.653) (total time: 12893ms): Feb 16 19:42:18 crc kubenswrapper[4675]: Trace[532417805]: ---"Objects listed" error: 12893ms (19:42:18.547) Feb 16 19:42:18 crc kubenswrapper[4675]: Trace[532417805]: [12.893734246s] [12.893734246s] END Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.547648 4675 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645444 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645502 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645528 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645558 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645581 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645604 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645628 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 
19:42:18.645654 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645679 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645735 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645766 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645803 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645835 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645894 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645897 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.645923 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646013 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646038 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646070 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646094 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646114 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646132 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646125 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646159 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646180 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646207 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646257 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646243 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646277 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646307 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646364 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646400 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646429 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 
19:42:18.646451 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646472 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646492 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646509 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646527 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646538 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
(OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646546 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646634 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646667 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646718 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646745 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646754 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646792 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646816 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646823 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646885 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646915 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646921 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646945 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646977 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.646992 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647008 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647035 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647064 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647088 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647114 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647122 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647146 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647178 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647207 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647234 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647240 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647290 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647323 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647349 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647379 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647403 4675 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647428 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647454 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647477 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647482 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647505 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647544 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647570 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647599 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647620 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647633 4675 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647643 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.647733 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648646 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648747 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648790 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648823 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648858 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648895 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648916 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648946 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.648983 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649008 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649026 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649040 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649080 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649116 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.649231 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650289 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650631 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650648 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650708 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650894 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.650957 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651057 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651055 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651116 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651255 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651165 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651328 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651632 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651660 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651879 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651927 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651952 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.651958 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652093 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652138 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652198 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652196 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652209 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652279 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652398 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652514 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652536 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.652679 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.653137 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.653166 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.654159 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.654542 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.654649 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.655052 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.655310 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.655661 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.655835 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656153 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656316 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656669 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656732 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656758 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656788 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656808 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656827 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656851 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656872 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656896 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.656918 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.657057 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.657761 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.657783 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.657908 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.659379 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.660219 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.660225 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.660412 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.660667 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.660746 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661077 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661111 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661149 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661198 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661327 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661342 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661548 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661559 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661587 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661594 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661637 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661660 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661682 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661723 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661790 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661824 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661755 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661849 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661877 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.661978 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662056 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662060 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662203 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662309 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662453 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662596 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662636 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662728 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662758 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662400 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662785 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662805 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.662845 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.663258 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.663285 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.663318 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.663341 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.663816 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664030 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664068 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664098 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664123 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664146 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.664943 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665007 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665035 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665062 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665086 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665108 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665130 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 
19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665153 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665175 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665195 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665216 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665598 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.665900 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.666124 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.666312 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.666372 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674852 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674916 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674949 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674972 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674994 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675017 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.666944 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.667276 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675510 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.667492 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.668832 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.669187 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.669483 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.669664 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.671496 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.672228 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.672502 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.672778 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.673006 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.673248 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.673521 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674541 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.674927 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675031 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675310 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675408 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675772 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675770 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675788 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675815 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675859 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675889 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675908 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675930 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675951 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675976 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.675998 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676022 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676046 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676065 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 16 
19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676075 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676087 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676130 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676166 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676198 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676263 4675 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676265 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676311 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676339 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676367 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676396 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676420 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676448 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676480 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676510 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676540 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 19:42:18 crc kubenswrapper[4675]: 
I0216 19:42:18.676576 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676610 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676634 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676664 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676762 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676792 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676821 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676850 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676875 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676903 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676932 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676959 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676979 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677005 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677029 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677054 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677076 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677099 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677124 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677147 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677170 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677201 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677223 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677245 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677266 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677284 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677305 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677350 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" 
(UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677442 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677471 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677491 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677514 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677536 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677559 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677576 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677595 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677616 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677637 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677655 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677676 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677714 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677793 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677822 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677851 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") 
" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677873 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677904 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677925 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677947 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677967 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677989 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678013 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678036 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678059 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678078 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678101 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678233 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678249 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678261 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678272 4675 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678286 4675 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678298 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678308 4675 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678317 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678329 4675 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678340 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678351 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678361 4675 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678374 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678383 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678393 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678405 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678414 4675 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678423 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678433 4675 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc 
kubenswrapper[4675]: I0216 19:42:18.678444 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678454 4675 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678463 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678473 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678484 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678495 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678505 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678518 4675 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678528 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678538 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678549 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678561 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678571 4675 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678581 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678589 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678601 4675 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678610 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678619 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678628 4675 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678639 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678653 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678663 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on 
node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692503 4675 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692548 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692573 4675 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692585 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692643 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692675 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692721 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692738 4675 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692753 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692791 4675 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676896 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.676922 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.677140 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.678015 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.681484 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692886 4675 csr.go:261] certificate signing request csr-pt8sf is approved, waiting to be issued Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.681741 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.681944 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.682137 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.688489 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.688619 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.688732 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.688963 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689120 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689202 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689209 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689301 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689494 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689549 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689638 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689814 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689855 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.689868 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690008 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690297 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690465 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690471 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693210 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690657 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690770 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.690852 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.691032 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.691082 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693258 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693431 4675 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693445 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693458 4675 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693469 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693484 4675 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693496 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693510 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693523 4675 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693535 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693546 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693559 4675 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.691209 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.691633 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.691824 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.692451 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693626 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693637 4675 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693652 4675 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693663 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693674 4675 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693684 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693714 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693727 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693737 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693749 4675 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693767 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693779 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693790 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 16 
19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693806 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693822 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693838 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693852 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693869 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693881 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693894 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 
19:42:18.693908 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693920 4675 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693930 4675 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693941 4675 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693952 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693962 4675 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693971 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693982 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.693995 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694006 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694018 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694029 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694068 4675 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694080 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694090 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694100 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.694111 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.734991 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.735145 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.735234 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:19.235192811 +0000 UTC m=+22.360482367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.735451 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.736341 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.736490 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740409 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740769 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740785 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740790 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741033 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740848 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.741083 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.741161 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:19.241137795 +0000 UTC m=+22.366427351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741469 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.736197 4675 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741572 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741598 4675 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741613 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741626 4675 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741636 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741720 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.741788 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.737306 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.740800 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742065 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742176 4675 csr.go:257] certificate signing request csr-pt8sf is issued Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742736 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742812 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742857 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.742879 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743327 4675 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743487 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743510 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743524 4675 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743535 4675 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743546 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743556 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743570 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743581 4675 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743591 4675 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743602 4675 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node 
\"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.743898 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.747500 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.748048 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.750925 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.750948 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.750961 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751023 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751160 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751208 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751316 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751679 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.751735 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:42:19.251707681 +0000 UTC m=+22.376997237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.751941 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.752002 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.752300 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.752479 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.752550 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.752566 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.753822 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.754826 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.756154 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.758146 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.760471 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.761522 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.762077 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.762104 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.762120 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.762179 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:19.262156924 +0000 UTC m=+22.387446480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.771261 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.771310 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.771330 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.771415 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:19.271385097 +0000 UTC m=+22.396674653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.771704 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.771763 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.775074 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.779170 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.788854 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.796947 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.797057 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 08:02:32.386693862 +0000 UTC
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.806080 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844453 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844514 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844574 4675 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844602 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844613 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844622 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844633 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844641 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844649 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844670 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844679 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844711 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844721 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844731 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844739 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844750 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844758 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844767 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844791 4675 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844800 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844809 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844817 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844825 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844834 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844843 4675 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844867 4675 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844876 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844885 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844892 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844902 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844912 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844920 4675 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844943 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844953 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844964 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844972 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844981 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844990 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.844998 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845021 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845031 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845039 4675 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845047 4675 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845056 4675 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845065 4675 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845074 4675 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845098 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845107 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845116 4675 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845125 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845136 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845145 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845154 4675 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845178 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845188 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845198 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845208 4675 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845218 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845228 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845251 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845261 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845270 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845278 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845286 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845295 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845304 4675 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845327 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845336 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845345 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845354 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845364 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845372 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845381 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845404 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845412 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845421 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845429 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845437 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845447 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845648 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.845712 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.890935 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.891060 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.891096 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.891229 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.891340 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:18 crc kubenswrapper[4675]: E0216 19:42:18.891397 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.894857 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.903093 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 19:42:18 crc kubenswrapper[4675]: I0216 19:42:18.910242 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.018403 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"485f317298cfc6aa62d3e76690beb20b3b338a15a7ff2e3977bbbf41a1aa4b72"} Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.019701 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0f3d49624d5318ac313902c2c859ba1e3915ad345a893fa028768fd3614f947e"} Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.021232 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0d0c0c560810b0cfa5e6b6562310232d2de716c8057091155ef8ddfe6e23c5ab"} Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.072724 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.092484 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.121991 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.139440 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.170326 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.191784 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.204889 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.223415 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.248859 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.248911 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.249026 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.249047 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.249095 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:20.249076415 +0000 UTC m=+23.374365971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.249163 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:20.249134777 +0000 UTC m=+23.374424533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.349330 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.349434 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349450 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:42:20.349427706 +0000 UTC m=+23.474717262 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.349488 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349559 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349576 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349578 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349589 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349592 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349601 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349633 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:20.34962434 +0000 UTC m=+23.474913896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:19 crc kubenswrapper[4675]: E0216 19:42:19.349647 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:20.349641211 +0000 UTC m=+23.474930767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.748354 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-16 19:37:18 +0000 UTC, rotation deadline is 2026-12-13 17:51:30.916846239 +0000 UTC
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.748413 4675 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7198h9m11.168437841s for next certificate rotation
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.797347 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:36:51.041782971 +0000 UTC
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.892641 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.893294 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.894948 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.895731 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.896932 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.897535 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.898307 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.899886 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.900798 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.906793 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.907595 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.909021 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.909561 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.910151 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.911161 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.911765 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.912776 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.913243 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.913960 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.915176 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.915775 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.916929 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.917445 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.918541 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.919039 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.919809 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.921133 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.921766 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.922911 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.923481 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.924557 4675 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.924708 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.926845 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.928196 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.928756 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.930939 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.931916 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.933266 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.934164 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.935417 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.935982 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.937094 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.937833 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.938990 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.939610 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.940872 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.941556 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.943515 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.944129 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.949856 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.950473 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.952120 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.952819 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 16 19:42:19 crc kubenswrapper[4675]: I0216 19:42:19.953336 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.025360 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad"}
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.026936 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890"}
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.026998 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1"}
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.049851 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z"
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.074786 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z"
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.091890 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.106812 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.110480 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.122106 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.126917 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.140764 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.144866 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-stxk6"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.145279 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.148031 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.148097 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.148139 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.162789 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.175992 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.190068 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.203863 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.216287 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.232315 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.233468 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.253723 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.267393 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.267681 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9fp\" (UniqueName: \"kubernetes.io/projected/af599569-feb0-432a-9adf-1b72d1ac1a57-kube-api-access-hp9fp\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.267831 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af599569-feb0-432a-9adf-1b72d1ac1a57-hosts-file\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.267946 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.267549 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.268215 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.268195825 +0000 UTC m=+25.393485381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.268072 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.268399 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.268388599 +0000 UTC m=+25.393678155 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.271253 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.287657 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.350870 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368453 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368588 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368593 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368719 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.368678238 +0000 UTC m=+25.493967794 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368749 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368761 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9fp\" (UniqueName: \"kubernetes.io/projected/af599569-feb0-432a-9adf-1b72d1ac1a57-kube-api-access-hp9fp\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368772 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368786 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368800 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368833 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.368822332 +0000 UTC m=+25.494111888 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368852 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af599569-feb0-432a-9adf-1b72d1ac1a57-hosts-file\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368885 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368901 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368913 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.368916 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af599569-feb0-432a-9adf-1b72d1ac1a57-hosts-file\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.368942 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.368933694 +0000 UTC m=+25.494223250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.407376 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9fp\" (UniqueName: \"kubernetes.io/projected/af599569-feb0-432a-9adf-1b72d1ac1a57-kube-api-access-hp9fp\") pod \"node-resolver-stxk6\" (UID: \"af599569-feb0-432a-9adf-1b72d1ac1a57\") " pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.431312 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.432402 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.459020 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-stxk6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.486296 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.563835 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pj5xg"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.564211 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.570281 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j7pnb"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.570612 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpc5z"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.571253 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.571612 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: W0216 19:42:20.572291 4675 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.572326 4675 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.572588 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rg8bp"] Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.573198 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.579508 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.579475 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"st
atic-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a
0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.579682 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.579753 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.579952 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.584992 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.585360 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 16 
19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.585784 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.586159 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.586394 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.586616 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.586836 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.587018 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.587198 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.587380 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.588738 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.589044 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.589181 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 
16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.596722 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.619098 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.648109 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.666767 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674239 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674322 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-multus-daemon-config\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674352 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10414964-83d0-4d95-a89f-e3212a8015b5-rootfs\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674378 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-kubelet\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674401 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-k8s-cni-cncf-io\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674490 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-bin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674539 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-hostroot\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674589 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-system-cni-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674612 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27l4j\" (UniqueName: \"kubernetes.io/projected/10414964-83d0-4d95-a89f-e3212a8015b5-kube-api-access-27l4j\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674643 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-cnibin\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674669 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674708 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674726 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s94dl\" (UniqueName: \"kubernetes.io/projected/c9a99563-d631-455f-8464-160e5619c610-kube-api-access-s94dl\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674767 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" 
Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674785 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674798 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-multus-certs\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674815 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-os-release\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674831 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674849 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbc5j\" (UniqueName: \"kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674870 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f77s\" (UniqueName: \"kubernetes.io/projected/f385df3d-0543-4189-88a6-2163c8b9b959-kube-api-access-9f77s\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674888 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674904 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674921 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.674961 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-netns\") pod 
\"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675010 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-multus\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675062 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675088 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675114 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675152 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675176 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-socket-dir-parent\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675200 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675223 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675247 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-os-release\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675286 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675340 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10414964-83d0-4d95-a89f-e3212a8015b5-proxy-tls\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675363 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10414964-83d0-4d95-a89f-e3212a8015b5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675386 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675406 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-system-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675427 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675450 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675469 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-cnibin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675490 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675510 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-etc-kubernetes\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675544 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675565 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675587 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.675608 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-conf-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.686050 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.717582 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.742035 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779155 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-system-cni-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779194 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27l4j\" (UniqueName: \"kubernetes.io/projected/10414964-83d0-4d95-a89f-e3212a8015b5-kube-api-access-27l4j\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779217 4675 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-cnibin\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779235 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779251 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779270 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779293 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779310 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-multus-certs\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779320 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-cnibin\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779333 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-system-cni-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779327 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s94dl\" (UniqueName: \"kubernetes.io/projected/c9a99563-d631-455f-8464-160e5619c610-kube-api-access-s94dl\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779439 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779466 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbc5j\" (UniqueName: 
\"kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779489 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-os-release\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779505 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779523 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779525 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779558 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779567 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779598 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-netns\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-multus\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779647 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779657 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f77s\" (UniqueName: 
\"kubernetes.io/projected/f385df3d-0543-4189-88a6-2163c8b9b959-kube-api-access-9f77s\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779707 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779745 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779767 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779792 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-socket-dir-parent\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779818 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779842 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779865 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-os-release\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779902 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779938 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10414964-83d0-4d95-a89f-e3212a8015b5-proxy-tls\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779963 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/10414964-83d0-4d95-a89f-e3212a8015b5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779986 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.779989 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-os-release\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780007 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780041 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-system-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780064 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-cnibin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780087 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780109 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780127 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-etc-kubernetes\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780142 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780149 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: 
\"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780180 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780199 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780217 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-conf-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780277 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780294 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780343 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-multus-daemon-config\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780368 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10414964-83d0-4d95-a89f-e3212a8015b5-rootfs\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780385 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-kubelet\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780405 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780427 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-k8s-cni-cncf-io\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc 
kubenswrapper[4675]: I0216 19:42:20.780443 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-bin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780464 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-hostroot\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780543 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-hostroot\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780568 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780596 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780624 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780653 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-conf-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780709 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780756 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f385df3d-0543-4189-88a6-2163c8b9b959-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781522 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781550 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781603 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-netns\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781640 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-multus\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781775 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-cnibin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781780 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781852 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781896 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781944 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-system-cni-dir\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.781989 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-kubelet\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782023 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780065 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782113 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-k8s-cni-cncf-io\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782152 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-multus-daemon-config\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782165 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-multus-socket-dir-parent\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782267 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10414964-83d0-4d95-a89f-e3212a8015b5-rootfs\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782351 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10414964-83d0-4d95-a89f-e3212a8015b5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782451 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-var-lib-cni-bin\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782535 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782670 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-os-release\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782768 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-etc-kubernetes\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.782792 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.783041 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.783087 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.783479 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.780395 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9a99563-d631-455f-8464-160e5619c610-host-run-multus-certs\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.794755 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.795155 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.798442 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 11:47:44.728346769 +0000 UTC Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.804186 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10414964-83d0-4d95-a89f-e3212a8015b5-proxy-tls\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.807941 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbc5j\" (UniqueName: \"kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j\") pod \"ovnkube-node-gpc5z\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.808006 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f77s\" (UniqueName: \"kubernetes.io/projected/f385df3d-0543-4189-88a6-2163c8b9b959-kube-api-access-9f77s\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: 
\"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.816601 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27l4j\" (UniqueName: \"kubernetes.io/projected/10414964-83d0-4d95-a89f-e3212a8015b5-kube-api-access-27l4j\") pod \"machine-config-daemon-j7pnb\" (UID: \"10414964-83d0-4d95-a89f-e3212a8015b5\") " pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.819921 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s94dl\" (UniqueName: \"kubernetes.io/projected/c9a99563-d631-455f-8464-160e5619c610-kube-api-access-s94dl\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.825244 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.837839 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.851606 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.883902 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.883957 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.883928 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.884094 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.884285 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:20 crc kubenswrapper[4675]: E0216 19:42:20.884387 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.900032 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.913186 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.917408 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:42:20 crc kubenswrapper[4675]: W0216 19:42:20.918011 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b6e2d5a_0472_425b_b5b4_0b94f14ebfba.slice/crio-9e64992c36b7a5261c136d3803e96d18254b5229a957b52f26cfbb1709838f65 WatchSource:0}: Error finding container 9e64992c36b7a5261c136d3803e96d18254b5229a957b52f26cfbb1709838f65: Status 404 returned error can't find the container with id 9e64992c36b7a5261c136d3803e96d18254b5229a957b52f26cfbb1709838f65 Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.931200 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: W0216 19:42:20.932073 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10414964_83d0_4d95_a89f_e3212a8015b5.slice/crio-ab25aaf02d83f0530748ef442afb8db620b50b5f74666e24bce48a2739193c4f WatchSource:0}: Error finding container 
ab25aaf02d83f0530748ef442afb8db620b50b5f74666e24bce48a2739193c4f: Status 404 returned error can't find the container with id ab25aaf02d83f0530748ef442afb8db620b50b5f74666e24bce48a2739193c4f Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.970335 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:20 crc kubenswrapper[4675]: I0216 19:42:20.988026 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:20Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.039152 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"9e64992c36b7a5261c136d3803e96d18254b5229a957b52f26cfbb1709838f65"} Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.043595 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"ab25aaf02d83f0530748ef442afb8db620b50b5f74666e24bce48a2739193c4f"} Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.044811 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stxk6" event={"ID":"af599569-feb0-432a-9adf-1b72d1ac1a57","Type":"ContainerStarted","Data":"c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945"} Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.044863 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stxk6" 
event={"ID":"af599569-feb0-432a-9adf-1b72d1ac1a57","Type":"ContainerStarted","Data":"35c36bd5c614817217693238c66db0efa3e3e1394387bbfcd3e246c954d8c2d5"} Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.050777 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.053950 4675 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.063397 4675 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.070990 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.103655 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.118148 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.135458 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.157898 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.220946 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.272373 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.297019 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.315947 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.334138 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.347800 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.367157 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.385717 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.400071 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.414607 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.432619 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.443579 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.466778 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.488188 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:21Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.783649 4675 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.783703 4675 configmap.go:193] Couldn't get 
configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.783785 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy podName:c9a99563-d631-455f-8464-160e5619c610 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.283754786 +0000 UTC m=+25.409044342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy") pod "multus-pj5xg" (UID: "c9a99563-d631-455f-8464-160e5619c610") : failed to sync configmap cache: timed out waiting for the condition Feb 16 19:42:21 crc kubenswrapper[4675]: E0216 19:42:21.783808 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy podName:f385df3d-0543-4189-88a6-2163c8b9b959 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:22.283799187 +0000 UTC m=+25.409088743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy") pod "multus-additional-cni-plugins-rg8bp" (UID: "f385df3d-0543-4189-88a6-2163c8b9b959") : failed to sync configmap cache: timed out waiting for the condition Feb 16 19:42:21 crc kubenswrapper[4675]: I0216 19:42:21.798624 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:56:14.79888454 +0000 UTC Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.023880 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.049295 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8"} Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.050617 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" exitCode=0 Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.050703 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.053245 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6"} Feb 16 19:42:22 crc 
kubenswrapper[4675]: I0216 19:42:22.053303 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058"} Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.066230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484
132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.078779 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.097462 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.109640 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.121884 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.144077 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.163531 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.180109 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.205747 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.218940 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.241029 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.253471 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.274173 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.290998 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.300129 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.300166 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy\") pod \"multus-pj5xg\" (UID: \"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.300190 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.300221 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.300286 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.300339 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:26.300321514 +0000 UTC m=+29.425611070 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.300509 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.300677 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:26.300630352 +0000 UTC m=+29.425920088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.300745 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f385df3d-0543-4189-88a6-2163c8b9b959-cni-binary-copy\") pod \"multus-additional-cni-plugins-rg8bp\" (UID: \"f385df3d-0543-4189-88a6-2163c8b9b959\") " pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.301034 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9a99563-d631-455f-8464-160e5619c610-cni-binary-copy\") pod \"multus-pj5xg\" (UID: 
\"c9a99563-d631-455f-8464-160e5619c610\") " pod="openshift-multus/multus-pj5xg" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.314758 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.326173 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.338744 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.350310 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.362534 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.374705 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.390490 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pj5xg" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.391989 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.401436 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.401580 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.401641 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.401815 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.401842 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.401853 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.401900 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:26.401885304 +0000 UTC m=+29.527174860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.402220 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 19:42:26.402209501 +0000 UTC m=+29.527499057 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.402286 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.402302 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.402315 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.402343 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:26.402335305 +0000 UTC m=+29.527624861 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:22 crc kubenswrapper[4675]: W0216 19:42:22.404938 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a99563_d631_455f_8464_160e5619c610.slice/crio-a96332f8dae0d5d2ecfb1471a577d7d96a35bd83d63ef7fc84e0c300e76fded9 WatchSource:0}: Error finding container a96332f8dae0d5d2ecfb1471a577d7d96a35bd83d63ef7fc84e0c300e76fded9: Status 404 returned error can't find the container with id a96332f8dae0d5d2ecfb1471a577d7d96a35bd83d63ef7fc84e0c300e76fded9 Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.418233 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.426361 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.438929 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: W0216 19:42:22.447354 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf385df3d_0543_4189_88a6_2163c8b9b959.slice/crio-e7e72d3c5be678a535d7609101415f98d5b91b4a57e0e707e3131b3f74d1e17d WatchSource:0}: Error finding container e7e72d3c5be678a535d7609101415f98d5b91b4a57e0e707e3131b3f74d1e17d: Status 404 returned error can't find the container with id e7e72d3c5be678a535d7609101415f98d5b91b4a57e0e707e3131b3f74d1e17d Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.456769 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.475227 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.490554 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.513985 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.529499 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.562119 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4vvwh"] Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.562738 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.564805 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.565055 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.565227 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.565258 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.584303 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.597831 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.611242 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.621356 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.642509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.655643 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.668416 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.685176 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.697963 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.705087 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2de9472b-0e23-4821-86b8-202bfd739aee-serviceca\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.705135 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2de9472b-0e23-4821-86b8-202bfd739aee-host\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.705225 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqw54\" (UniqueName: \"kubernetes.io/projected/2de9472b-0e23-4821-86b8-202bfd739aee-kube-api-access-cqw54\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.714841 4675 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.726607 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.738207 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.749786 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.763801 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.775908 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.799171 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 11:32:01.54459863 +0000 UTC Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.806270 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqw54\" (UniqueName: \"kubernetes.io/projected/2de9472b-0e23-4821-86b8-202bfd739aee-kube-api-access-cqw54\") pod 
\"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.806359 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2de9472b-0e23-4821-86b8-202bfd739aee-serviceca\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.806393 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2de9472b-0e23-4821-86b8-202bfd739aee-host\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.806461 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2de9472b-0e23-4821-86b8-202bfd739aee-host\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.808025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2de9472b-0e23-4821-86b8-202bfd739aee-serviceca\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.828416 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqw54\" (UniqueName: \"kubernetes.io/projected/2de9472b-0e23-4821-86b8-202bfd739aee-kube-api-access-cqw54\") pod \"node-ca-4vvwh\" (UID: \"2de9472b-0e23-4821-86b8-202bfd739aee\") " pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.883254 4675 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.883320 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:22 crc kubenswrapper[4675]: I0216 19:42:22.883376 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.883462 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.883607 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:22 crc kubenswrapper[4675]: E0216 19:42:22.883825 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.059144 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerStarted","Data":"e7e72d3c5be678a535d7609101415f98d5b91b4a57e0e707e3131b3f74d1e17d"} Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.060224 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerStarted","Data":"a96332f8dae0d5d2ecfb1471a577d7d96a35bd83d63ef7fc84e0c300e76fded9"} Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.064526 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.064880 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.064912 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.095028 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4vvwh" Feb 16 19:42:23 crc kubenswrapper[4675]: I0216 19:42:23.799572 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 16:17:11.781382 +0000 UTC Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.069921 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.069979 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.069990 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.071445 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476" exitCode=0 Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.071591 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.073460 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerStarted","Data":"798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.074476 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4vvwh" event={"ID":"2de9472b-0e23-4821-86b8-202bfd739aee","Type":"ContainerStarted","Data":"e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.074544 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4vvwh" event={"ID":"2de9472b-0e23-4821-86b8-202bfd739aee","Type":"ContainerStarted","Data":"ca77edfdfe610f494d1325ad7fc201e41d24064160eceb818b8940effda15e03"} Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.094560 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.109703 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.136063 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.149548 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.165535 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.185278 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.198632 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.212113 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.234091 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.259576 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.273737 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.297639 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.316513 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.330320 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.344283 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.359057 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.385721 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.403668 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.424495 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.438606 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.453488 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.476820 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.490239 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.505626 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.516735 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.535588 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.553072 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.571677 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.584489 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.600871 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.800972 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:20:52.296035419 +0000 
UTC Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.883986 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:24 crc kubenswrapper[4675]: E0216 19:42:24.884188 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.884706 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.884727 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:24 crc kubenswrapper[4675]: E0216 19:42:24.885135 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:24 crc kubenswrapper[4675]: E0216 19:42:24.885245 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.948043 4675 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.950574 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.950751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.950837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.951023 4675 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.957346 4675 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.957638 4675 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.958940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.958984 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.958994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.959010 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.959019 4675 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:24Z","lastTransitionTime":"2026-02-16T19:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:24 crc kubenswrapper[4675]: E0216 19:42:24.978989 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.984064 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.984096 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.984104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.984121 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:24 crc kubenswrapper[4675]: I0216 19:42:24.984133 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:24Z","lastTransitionTime":"2026-02-16T19:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:24 crc kubenswrapper[4675]: E0216 19:42:24.998303 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.003013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.003043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.003054 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.003070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.003080 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: E0216 19:42:25.016165 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.020090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.020119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.020129 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.020144 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.020155 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: E0216 19:42:25.035642 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.039324 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.039373 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.039386 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.039402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.039412 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: E0216 19:42:25.052668 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: E0216 19:42:25.052852 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.054968 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.055001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.055012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.055027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc 
kubenswrapper[4675]: I0216 19:42:25.055038 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.090951 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerStarted","Data":"02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.105912 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.119113 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.130252 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.141724 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.154468 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.158802 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.158841 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.158851 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.158871 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.158882 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.172715 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.190639 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.203261 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.217174 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.236183 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.261173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc 
kubenswrapper[4675]: I0216 19:42:25.261221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.261231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.261247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.261260 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.276901 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.303654 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.333128 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.363606 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.363650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.363660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.363676 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.363702 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.365960 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.378877 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:25Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.466732 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.466779 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.466788 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.466806 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.466820 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.568281 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.568410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.568469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.568542 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.568608 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.671546 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.671618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.671631 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.671649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.671662 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.774707 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.774761 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.774775 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.774796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.774814 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.801388 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:46:25.386068696 +0000 UTC
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.877869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.877945 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.877966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.877994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.878012 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.980969 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.981005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.981014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.981029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:42:25 crc kubenswrapper[4675]: I0216 19:42:25.981039 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:25Z","lastTransitionTime":"2026-02-16T19:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.083741 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.083801 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.083813 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.083835 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.083849 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.098300 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"}
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.100332 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c" exitCode=0
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.100399 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c"}
Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.121617 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.138630 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.167985 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.186482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.186833 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.186942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.187033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 
19:42:26.187127 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.187508 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.203948 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.228081 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.243037 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.259611 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.276292 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.289122 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.295510 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.295551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.295564 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 
19:42:26.295589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.295606 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.303234 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.320852 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.333509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.349426 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.357846 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.357934 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.358054 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.358146 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:34.358122408 +0000 UTC m=+37.483412164 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.358190 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.358322 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:34.358294422 +0000 UTC m=+37.483583968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.363445 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:26Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.398105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.398558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.398580 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.398608 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.398628 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.458417 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.458553 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.458596 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458776 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458798 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458813 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458858 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458885 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:34.458861897 +0000 UTC m=+37.584151473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458901 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.458939 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.459053 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:42:34.458903048 +0000 UTC m=+37.584192644 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.459107 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:34.459088043 +0000 UTC m=+37.584377589 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.509192 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.509236 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.509246 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.509264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.509275 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.612618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.612729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.612743 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.612764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.612776 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.716593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.716641 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.716654 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.716673 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.716703 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.801976 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 20:57:49.248017918 +0000 UTC Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.819141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.819209 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.819235 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.819270 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.819294 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.883965 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.884024 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.883976 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.884204 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.884359 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:26 crc kubenswrapper[4675]: E0216 19:42:26.884438 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.922318 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.922379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.922395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.922425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:26 crc kubenswrapper[4675]: I0216 19:42:26.922445 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:26Z","lastTransitionTime":"2026-02-16T19:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.025442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.025491 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.025503 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.025523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.025537 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.108241 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8" exitCode=0 Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.108306 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.128364 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.128420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.128440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.128468 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.128486 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.134192 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.160158 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.175869 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.196522 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.214671 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.232889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.232977 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.232990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.233010 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.233045 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.239589 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.257016 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.274780 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.290885 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.308562 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.334472 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.336858 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.336899 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.336911 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.336931 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.336945 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.350934 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.374118 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.389020 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.409600 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 
19:42:27.440661 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.440736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.440751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.440773 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.440814 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.545224 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.545313 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.545337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.545371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.545395 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.585460 4675 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.648807 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.649122 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.649450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.649549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.649625 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.751959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.752029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.752041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.752063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.752076 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.802633 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 06:50:13.937040306 +0000 UTC Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.855009 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.855068 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.855081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.855105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.855119 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.901594 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515b
e84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.916193 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.929304 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.945760 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.956797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.956835 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.956843 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.956861 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.956872 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:27Z","lastTransitionTime":"2026-02-16T19:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.966891 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.980442 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:27 crc kubenswrapper[4675]: I0216 19:42:27.992657 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.015908 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.053824 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.059982 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.060027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.060039 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.060062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.060103 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.066474 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.083579 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.102398 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.118171 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.118408 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.118742 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.123055 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb" exitCode=0 Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.123105 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.140973 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 
19:42:28.153315 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163543 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163830 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.163866 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.167460 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.182544 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.196581 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.209723 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.224645 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.241824 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.262940 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.268314 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.268366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.268379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.268403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.268417 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.280418 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.294224 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.307825 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.322147 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.335170 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.347877 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.371442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.371479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.371488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.371507 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.371518 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.375595 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.390429 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.408247 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.424961 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.442118 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.454929 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.474021 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.474074 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.474091 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.474117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.474132 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.483117 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.497051 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.512231 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.530522 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.550578 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.569238 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.577816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.577853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.577863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.577882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.577893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.591125 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.608163 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.621926 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.635206 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.650062 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:28Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.681144 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.681187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.681203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.681225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.681242 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.784034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.784096 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.784116 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.784135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.784146 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.803745 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:42:49.64770994 +0000 UTC Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.884267 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.884444 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.884557 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:28 crc kubenswrapper[4675]: E0216 19:42:28.884543 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:28 crc kubenswrapper[4675]: E0216 19:42:28.885208 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:28 crc kubenswrapper[4675]: E0216 19:42:28.885201 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.886046 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.886079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.886089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.886102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.886112 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.988666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.988730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.988741 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.988758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:28 crc kubenswrapper[4675]: I0216 19:42:28.988769 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:28Z","lastTransitionTime":"2026-02-16T19:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.092705 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.092748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.092756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.092775 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.092786 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.130560 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerStarted","Data":"fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.130715 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.131312 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.148371 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.157620 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.169825 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.189582 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.195538 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.195612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.195633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.195660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.195701 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.210091 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.238724 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.270329 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.286404 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.298483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.298566 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.298589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.298617 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.298637 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.310803 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f7
7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.324264 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.338350 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.349767 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.365225 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.379096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.402966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.403024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.403044 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.403066 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.403080 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.404911 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.425500 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.447609 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.462638 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.478036 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.499622 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.514140 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc 
kubenswrapper[4675]: I0216 19:42:29.514200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.514213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.514233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.514245 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.519230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.532633 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.547192 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.559198 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.578428 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f7
7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.593642 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.610649 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.617291 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.617335 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.617345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.617363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.617378 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.629788 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.644735 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.663273 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.690532 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:29Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.720187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.720243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.720256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.720275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.720289 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.804397 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 10:51:57.167879318 +0000 UTC Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.823629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.823665 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.823674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.823705 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.823715 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.925851 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.925904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.925919 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.925945 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:29 crc kubenswrapper[4675]: I0216 19:42:29.925959 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:29Z","lastTransitionTime":"2026-02-16T19:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.028755 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.028798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.028810 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.028830 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.028844 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.131753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.131825 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.131837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.131854 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.131866 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.135999 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4" exitCode=0 Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.136085 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.136162 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.155773 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.169138 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.198545 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb674
4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.212541 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.234235 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.234301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.234318 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.234347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.234375 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.239863 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.265469 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:
02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.283639 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.298958 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.324911 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.338267 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.338316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.338333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.338359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.338377 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.363116 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:
42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.401458 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.441740 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.443339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.443394 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.443407 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.443431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.443444 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.455869 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.471380 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.485136 4675 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:30Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.546789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.546863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.546881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.546910 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.546932 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.650143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.650188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.650201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.650230 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.650246 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.752953 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.752986 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.752994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.753009 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.753018 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.805317 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 17:33:08.929031768 +0000 UTC Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.855996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.856041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.856050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.856068 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.856080 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.883475 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.883474 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:30 crc kubenswrapper[4675]: E0216 19:42:30.883599 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:30 crc kubenswrapper[4675]: E0216 19:42:30.883662 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.883474 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:30 crc kubenswrapper[4675]: E0216 19:42:30.883739 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.965303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.965336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.965345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.965362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:30 crc kubenswrapper[4675]: I0216 19:42:30.965371 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:30Z","lastTransitionTime":"2026-02-16T19:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.069466 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.069498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.069507 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.069523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.069535 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.149630 4675 generic.go:334] "Generic (PLEG): container finished" podID="f385df3d-0543-4189-88a6-2163c8b9b959" containerID="f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c" exitCode=0 Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.149834 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.151356 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerDied","Data":"f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.173014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.173078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.173100 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.173135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.173160 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.186359 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.206885 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.233278 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.248031 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.265665 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a
4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.278875 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.278916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.278926 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.278946 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.278959 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.296232 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.317779 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.342175 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.364038 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.381338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc 
kubenswrapper[4675]: I0216 19:42:31.381375 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.381385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.381402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.381411 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.386226 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb7
4d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.400277 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.417417 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.433489 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.450124 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.462842 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:31Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.483892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.484164 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.484179 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.484214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.484234 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.586905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.586950 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.586963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.586983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.586999 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.690015 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.690094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.690103 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.690123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.690135 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.793294 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.793349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.793362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.793383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.793397 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.805440 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 08:42:04.470033968 +0000 UTC Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.895740 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.895778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.895787 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.895804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.895814 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.998011 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.998069 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.998083 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.998111 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:31 crc kubenswrapper[4675]: I0216 19:42:31.998126 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:31Z","lastTransitionTime":"2026-02-16T19:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.101166 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.101213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.101231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.101252 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.101266 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.157135 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" event={"ID":"f385df3d-0543-4189-88a6-2163c8b9b959","Type":"ContainerStarted","Data":"f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.180827 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:4
2:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.200415 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.204803 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.204863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.204876 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.204897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.204909 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.215574 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.229523 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.243369 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.259042 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.277235 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.303186 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.307427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.307493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.307506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.307526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.307540 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.315351 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.332166 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.347548 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.367190 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.385089 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.400112 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.410461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.410495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.410504 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.410522 4675 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.410532 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.412066 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:32Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.512804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.512859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.512869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.512893 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.512909 4675 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.616354 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.616402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.616415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.616436 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.616449 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.719469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.719532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.719547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.719568 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.719581 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.806105 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:28:32.851355907 +0000 UTC Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.822644 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.822737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.822757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.822787 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.822805 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.883315 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:32 crc kubenswrapper[4675]: E0216 19:42:32.883703 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.883342 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:32 crc kubenswrapper[4675]: E0216 19:42:32.883986 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.883315 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:32 crc kubenswrapper[4675]: E0216 19:42:32.884191 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.925253 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.925479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.925582 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.925743 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:32 crc kubenswrapper[4675]: I0216 19:42:32.925827 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:32Z","lastTransitionTime":"2026-02-16T19:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.028467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.028506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.028515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.028531 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.028543 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.131248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.131475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.131565 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.131634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.131714 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.162262 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/0.log" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.164954 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666" exitCode=1 Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.165552 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.166069 4675 scope.go:117] "RemoveContainer" containerID="33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.181740 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.206813 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.224041 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.235138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.235188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.235201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.235219 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.235231 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.241915 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.254480 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.270340 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.297802 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event 
handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.316042 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.326900 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.337611 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.337672 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.337703 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.337727 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.337740 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.340416 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.352879 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.366532 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.380179 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.402527 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.416713 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.440881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.440940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.440953 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.440971 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.440983 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.544150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.544211 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.544241 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.544265 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.544282 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.646874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.646927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.646941 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.646959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.646971 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.749333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.749380 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.749391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.749411 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.749422 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.807051 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 18:55:55.770510166 +0000 UTC Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.852027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.852068 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.852080 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.852099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.852111 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.955117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.955153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.955161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.955202 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.955216 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:33Z","lastTransitionTime":"2026-02-16T19:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.969569 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p"] Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.970096 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.972716 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.973230 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 16 19:42:33 crc kubenswrapper[4675]: I0216 19:42:33.994263 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:33Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.008816 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.028751 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.041188 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.041726 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df2gr\" (UniqueName: \"kubernetes.io/projected/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-kube-api-access-df2gr\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.041972 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.042102 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.045918 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.057895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc 
kubenswrapper[4675]: I0216 19:42:34.057937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.057953 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.057972 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.057982 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.069729 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19
:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.085449 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.104030 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.117953 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.133039 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.143751 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.144057 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-df2gr\" (UniqueName: \"kubernetes.io/projected/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-kube-api-access-df2gr\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.144197 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.144811 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.144597 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.145272 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" 
Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.150601 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b67
9cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.151623 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.160421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.160484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.160497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.160519 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.160535 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.164017 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df2gr\" (UniqueName: \"kubernetes.io/projected/e98c1ed8-cbed-4d30-b3fc-41beff5d65c2-kube-api-access-df2gr\") pod \"ovnkube-control-plane-749d76644c-h9s8p\" (UID: \"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.167828 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.174251 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/0.log" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.178191 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.178672 4675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 
19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.184971 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610a
de9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.198977 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.215248 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f01
3b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.226328 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.238217 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.258643 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750
ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.263907 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.263946 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.263960 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.263984 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.263999 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.273100 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.285344 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.288010 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.311541 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.336584 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 
19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.353565 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.366451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.366497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.366508 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc 
kubenswrapper[4675]: I0216 19:42:34.366526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.366541 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.369681 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19
:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.385848 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.401040 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.421022 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.434082 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.449339 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.449434 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.449588 4675 configmap.go:193] Couldn't 
get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.449709 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.449660083 +0000 UTC m=+53.574949659 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.450318 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.450380 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.4503661 +0000 UTC m=+53.575655676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.452349 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d
986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.466509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.470378 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.470412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.470423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.470442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.470455 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.486523 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.503295 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.521646 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:34Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.550276 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.550596 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.550547616 +0000 UTC m=+53.675837212 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.550802 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.550947 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551127 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551205 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551275 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551385 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.551365666 +0000 UTC m=+53.676655222 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551208 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551602 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551654 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.551839 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.551795326 +0000 UTC m=+53.677085032 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.574190 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.574520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.574643 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.574767 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.574844 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.678017 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.678071 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.678081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.678099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.678109 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.781269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.781598 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.781897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.782102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.782332 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.807632 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:21:44.697986471 +0000 UTC Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.883331 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.883351 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.883869 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.884230 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.884712 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.885251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.885337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.885401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.885477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.885549 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:34 crc kubenswrapper[4675]: E0216 19:42:34.885487 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.988732 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.988781 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.988792 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.988816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:34 crc kubenswrapper[4675]: I0216 19:42:34.988826 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:34Z","lastTransitionTime":"2026-02-16T19:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.091662 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.091737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.091751 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.091806 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.091822 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.093842 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sbgjb"] Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.094478 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.094560 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.123504 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.123568 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.123581 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.123601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.123613 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.124816 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.139335 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144433 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144452 4675 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144462 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.144460 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.157286 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.160402 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.162401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.162446 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.162456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.162473 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.162487 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.175174 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.179625 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.179671 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.179706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.179769 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.179783 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.188741 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" event={"ID":"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2","Type":"ContainerStarted","Data":"20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.188799 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" event={"ID":"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2","Type":"ContainerStarted","Data":"410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.188812 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" event={"ID":"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2","Type":"ContainerStarted","Data":"a6970ce6274e1fab8dba7cd8a6b52bc35228f0bba650a33f1818587e3c0e8ced"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.189968 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 
19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.192591 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/1.log" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.193139 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/0.log" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.196897 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9" exitCode=1 Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.196947 4675 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.196994 4675 scope.go:117] "RemoveContainer" containerID="33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.198015 4675 scope.go:117] "RemoveContainer" containerID="2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.198200 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.200795 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.205159 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.205187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.205195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.205212 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.205221 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.206875 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.218326 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.218854 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.220741 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.220776 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.220789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.220810 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.220828 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.223496 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.237269 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.251866 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.262228 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.262297 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nm4b4\" (UniqueName: \"kubernetes.io/projected/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-kube-api-access-nm4b4\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.264898 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.281642 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.294951 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.314850 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.323450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.323497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.323509 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.323532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.323547 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.330421 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.343122 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc 
kubenswrapper[4675]: I0216 19:42:35.357661 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.362931 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.363021 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm4b4\" (UniqueName: \"kubernetes.io/projected/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-kube-api-access-nm4b4\") pod \"network-metrics-daemon-sbgjb\" (UID: 
\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.363579 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.363681 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:42:35.863647646 +0000 UTC m=+38.988937412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.372034 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.383390 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm4b4\" (UniqueName: \"kubernetes.io/projected/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-kube-api-access-nm4b4\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.389304 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.405101 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.419401 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.425674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.425736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.425747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.425764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.425779 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.433634 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.460276 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.475392 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.493927 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.507722 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.523495 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.528296 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc 
kubenswrapper[4675]: I0216 19:42:35.528335 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.528347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.528367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.528379 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.547637 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 
19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] 
[{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"c
ri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.560471 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.575822 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.599707 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.616308 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.630588 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.633257 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.633332 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.633355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.633383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.633429 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.648309 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.661615 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.673943 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:35Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:35 crc 
kubenswrapper[4675]: I0216 19:42:35.735505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.735555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.735567 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.735588 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.735601 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.809371 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 08:13:03.989368057 +0000 UTC Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.838749 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.838796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.838811 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.838829 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.838842 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.868215 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.868429 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:35 crc kubenswrapper[4675]: E0216 19:42:35.868504 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:42:36.868483681 +0000 UTC m=+39.993773237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.942313 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.942402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.942414 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.942433 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:35 crc kubenswrapper[4675]: I0216 19:42:35.942444 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:35Z","lastTransitionTime":"2026-02-16T19:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.045937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.045983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.045994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.046011 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.046022 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.149184 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.149255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.149292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.149315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.149326 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.203289 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/1.log" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.253203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.253247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.253255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.253272 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.253287 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.356360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.356417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.356457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.356482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.356497 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.458874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.458933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.458950 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.458981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.458999 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.562060 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.562408 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.562474 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.562503 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.562513 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.665666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.665783 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.665803 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.665831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.665853 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.769162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.769205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.769214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.769233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.769248 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.810401 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:08:21.635819908 +0000 UTC Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.871749 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.871823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.871842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.871878 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.871901 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.879260 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.879479 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.879611 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:42:38.879577265 +0000 UTC m=+42.004866851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.884233 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.884323 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.884328 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.884410 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.884243 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.884582 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.884760 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:36 crc kubenswrapper[4675]: E0216 19:42:36.884836 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.975259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.975333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.975362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.975393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:36 crc kubenswrapper[4675]: I0216 19:42:36.975417 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:36Z","lastTransitionTime":"2026-02-16T19:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.078589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.078654 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.078676 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.078740 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.078758 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.181501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.181553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.181569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.181593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.181609 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.284363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.284419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.284429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.284449 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.284460 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.387541 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.387625 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.387636 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.387657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.387670 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.490803 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.490854 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.490882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.490905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.490916 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.594661 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.594805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.594836 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.594873 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.594893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.698030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.698401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.698419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.698438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.698450 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.801391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.801432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.801441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.801456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.801467 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.811501 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 22:36:43.257995893 +0000 UTC Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.901671 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.904110 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.904147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.904160 4675 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.904179 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.904195 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:37Z","lastTransitionTime":"2026-02-16T19:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.916045 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724
115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.931923 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce7
35cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.954939 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.975906 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:37 crc kubenswrapper[4675]: I0216 19:42:37.993902 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:37Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.006906 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.006986 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.007006 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.007035 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.007055 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.016466 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:
42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.041462 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33bb9f09621d4a84e0041c922049323caf0373f0ef87a05bba0d4237b8782666\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"message\\\":\\\"9:42:32.347198 5892 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 19:42:32.347212 5892 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 19:42:32.347197 5892 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:32.347290 5892 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 
19:42:32.347359 5892 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:32.347440 5892 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:32.347462 5892 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 19:42:32.347442 5892 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:32.347508 5892 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:32.347556 5892 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:32.347596 5892 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:32.347620 5892 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:32.347640 5892 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:32.347664 5892 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:32.347716 5892 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:32.347718 5892 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 
},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\
\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.059866 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.081477 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.098310 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.109722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.109768 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.109783 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.109806 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.109819 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.116387 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.141280 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.157139 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.173322 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.187800 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.201311 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:38Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:38 crc 
kubenswrapper[4675]: I0216 19:42:38.212534 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.212576 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.212587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.212604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.212615 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.315432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.315482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.315500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.315521 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.315533 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.418293 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.418577 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.418646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.418744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.418822 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.521310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.521355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.521366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.521385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.521396 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.624501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.624555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.624566 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.624587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.624600 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.727068 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.727109 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.727119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.727138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.727149 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.812137 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:37:13.856844565 +0000 UTC Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.830786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.830837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.830848 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.830865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.830876 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.884140 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.884208 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.884317 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.884153 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.884462 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.884512 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.884561 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.884602 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.901229 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.901381 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:38 crc kubenswrapper[4675]: E0216 19:42:38.901436 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:42:42.901419607 +0000 UTC m=+46.026709163 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.934015 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.934070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.934087 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.934110 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:38 crc kubenswrapper[4675]: I0216 19:42:38.934122 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:38Z","lastTransitionTime":"2026-02-16T19:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.037372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.037440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.037454 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.037479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.037498 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.140955 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.141003 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.141023 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.141043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.141056 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.244263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.244341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.244365 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.244402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.244428 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.347429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.347535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.347562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.347599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.347623 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.452161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.452226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.452239 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.452265 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.452282 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.556041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.556114 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.556134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.556169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.556189 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.659431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.659496 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.659514 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.659544 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.659564 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.763110 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.763206 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.763248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.763276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.763288 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.812481 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 09:32:06.539238348 +0000 UTC Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.866604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.866655 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.866668 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.866713 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.866736 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.969780 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.969832 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.969841 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.969859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:39 crc kubenswrapper[4675]: I0216 19:42:39.969875 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:39Z","lastTransitionTime":"2026-02-16T19:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.072947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.073014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.073028 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.073049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.073070 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.175815 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.175881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.175909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.175940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.175960 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.279324 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.279392 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.279410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.279437 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.279459 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.382870 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.382920 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.382931 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.382951 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.382962 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.486310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.486387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.486406 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.486440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.486460 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.588831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.588905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.588925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.588952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.588977 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.692041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.692113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.692136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.692168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.692191 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.795421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.795505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.795531 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.795561 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.795586 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.812977 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:10:02.837161677 +0000 UTC Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.883975 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.884000 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.884035 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.884066 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:40 crc kubenswrapper[4675]: E0216 19:42:40.884508 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:40 crc kubenswrapper[4675]: E0216 19:42:40.884777 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:40 crc kubenswrapper[4675]: E0216 19:42:40.884884 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:40 crc kubenswrapper[4675]: E0216 19:42:40.885065 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.898463 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.898622 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.898725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.898867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:40 crc kubenswrapper[4675]: I0216 19:42:40.898943 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:40Z","lastTransitionTime":"2026-02-16T19:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.001618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.002108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.002270 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.002415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.002507 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.105551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.106073 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.106338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.106547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.106788 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.209889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.209982 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.210010 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.210055 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.210078 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.313608 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.313678 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.313734 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.313764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.313783 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.417547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.417623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.417642 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.417669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.417719 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.521079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.521135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.521147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.521168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.521180 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.623634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.624097 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.624239 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.624341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.624448 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.728028 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.728099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.728119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.728149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.728169 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.813460 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:27:19.609621507 +0000 UTC Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.831784 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.832178 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.832366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.832560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.832740 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.936618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.937196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.937357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.937495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:41 crc kubenswrapper[4675]: I0216 19:42:41.937731 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:41Z","lastTransitionTime":"2026-02-16T19:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.041325 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.041384 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.041397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.041423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.041439 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.144643 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.144714 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.144725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.144745 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.144758 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.248160 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.248226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.248243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.248281 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.248298 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.352081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.352143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.352161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.352186 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.352203 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.455418 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.455473 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.455484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.455503 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.455516 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.559362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.559416 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.559431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.559452 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.559462 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.662660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.663042 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.663183 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.663338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.663494 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.766937 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.766986 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.767003 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.767029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.767047 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.814798 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 14:00:18.128765981 +0000 UTC Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.870464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.870515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.870526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.870543 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.870554 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.883831 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.883873 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.883895 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.884192 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.884192 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.884328 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.884528 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.884740 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.904083 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.905405 4675 scope.go:117] "RemoveContainer" containerID="2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.905807 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.926377 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:42Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.947165 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.947415 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:42 crc kubenswrapper[4675]: E0216 19:42:42.947748 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:42:50.947720992 +0000 UTC m=+54.073010558 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.955864 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda
6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:42Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.974119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.974737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.975050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.975740 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.976210 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:42Z","lastTransitionTime":"2026-02-16T19:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:42 crc kubenswrapper[4675]: I0216 19:42:42.984730 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:42Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.002160 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:42Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.019916 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.037858 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.066885 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.080064 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.080303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.080420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.080518 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.080603 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.084544 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.102575 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.120479 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.132977 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.148188 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc 
kubenswrapper[4675]: I0216 19:42:43.168286 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.183627 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.183679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.183705 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.183730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.183742 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.187202 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.202104 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.219264 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.232770 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:43Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.286454 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.286524 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.286535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.286551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.286560 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.389855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.389914 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.389927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.389954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.389971 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.493321 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.493366 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.493377 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.493393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.493416 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.597421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.598384 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.602880 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.602981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.603001 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.706460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.706531 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.706553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.706581 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.706599 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.810229 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.810729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.810917 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.811160 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.811344 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.815478 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 18:50:12.790079782 +0000 UTC Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.914820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.914882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.914899 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.914916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:43 crc kubenswrapper[4675]: I0216 19:42:43.914929 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:43Z","lastTransitionTime":"2026-02-16T19:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.022190 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.022259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.022277 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.022301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.022319 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.125395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.125456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.125475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.125499 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.125518 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.229064 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.229163 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.229180 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.229225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.229243 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.333301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.333356 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.333369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.333391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.333403 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.436261 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.436307 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.436317 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.436332 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.436343 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.539514 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.539575 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.539593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.539615 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.539628 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.642766 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.642823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.642834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.642852 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.642864 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.746979 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.747036 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.747053 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.747081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.747099 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.815979 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:43:00.670539463 +0000 UTC Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.850412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.850497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.850526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.850559 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.851531 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.884161 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.884166 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.884316 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:44 crc kubenswrapper[4675]: E0216 19:42:44.884419 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:44 crc kubenswrapper[4675]: E0216 19:42:44.884551 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:44 crc kubenswrapper[4675]: E0216 19:42:44.884648 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.884771 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:44 crc kubenswrapper[4675]: E0216 19:42:44.884884 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.955591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.955744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.955771 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.955805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:44 crc kubenswrapper[4675]: I0216 19:42:44.955901 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:44Z","lastTransitionTime":"2026-02-16T19:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.058744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.058849 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.058865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.058933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.058959 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.162821 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.162893 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.162913 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.162943 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.162969 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.267058 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.267212 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.267336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.267427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.267460 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.371079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.371123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.371135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.371149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.371160 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.474172 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.474241 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.474251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.474271 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.474284 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.577078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.577153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.577171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.577201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.577222 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.612626 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.612730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.612753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.612778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.612795 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.635992 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:45Z is after 2025-08-24T17:21:41Z"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.642599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.642660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.642674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.642710 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.642724 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.662313 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:45Z is after 2025-08-24T17:21:41Z"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.666992 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.667019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.667029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.667049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.667061 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.688509 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:45Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.694158 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.694239 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.694249 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.694267 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.694279 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.712526 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:45Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.718341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.718397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.718412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.718435 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.718451 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.733096 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:45Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:45 crc kubenswrapper[4675]: E0216 19:42:45.733209 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.736086 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.736165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.736182 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.736218 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.736234 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.817133 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:09:56.058615197 +0000 UTC Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.840465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.840575 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.840595 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.840619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.840636 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.943410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.943464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.943485 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.943510 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:45 crc kubenswrapper[4675]: I0216 19:42:45.943529 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:45Z","lastTransitionTime":"2026-02-16T19:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.046888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.046983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.047005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.047034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.047054 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.149661 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.149735 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.149748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.149771 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.149793 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.253160 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.253203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.253215 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.253233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.253244 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.356326 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.356400 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.356417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.356433 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.356448 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.459848 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.459894 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.459903 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.459928 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.459939 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.562727 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.562783 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.562795 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.562814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.562826 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.666540 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.666596 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.666610 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.666630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.666643 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.770181 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.770226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.770235 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.770256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.770267 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.818141 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:43:44.61088925 +0000 UTC Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.874358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.874423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.874442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.874469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.874682 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.883745 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.883830 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.883856 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.883787 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:46 crc kubenswrapper[4675]: E0216 19:42:46.883998 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:46 crc kubenswrapper[4675]: E0216 19:42:46.884110 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:46 crc kubenswrapper[4675]: E0216 19:42:46.884346 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:46 crc kubenswrapper[4675]: E0216 19:42:46.884416 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.978313 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.978361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.978373 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.978393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:46 crc kubenswrapper[4675]: I0216 19:42:46.978404 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:46Z","lastTransitionTime":"2026-02-16T19:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.081218 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.081285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.081304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.081332 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.081350 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.184031 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.184102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.184113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.184138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.184151 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.287441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.287505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.287518 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.287539 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.287551 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.390805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.390880 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.390899 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.390927 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.390947 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.493651 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.493704 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.493713 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.493730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.493744 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.596458 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.596525 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.596542 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.596571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.596589 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.706019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.706104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.706124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.706154 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.706172 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.809375 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.809465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.809479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.809500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.809514 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.819067 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:00:13.879616833 +0000 UTC Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.905591 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:47Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.913027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.913115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.913141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.913182 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.913208 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:47Z","lastTransitionTime":"2026-02-16T19:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.926805 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:
42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:47Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.952175 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:47Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.970271 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:47Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:47 crc kubenswrapper[4675]: I0216 19:42:47.989257 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:47Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.006056 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.016303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.016354 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.016367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.016389 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.016400 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.023928 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.043715 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.059541 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.076285 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.093419 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.112440 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc 
kubenswrapper[4675]: I0216 19:42:48.118450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.118511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.118525 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.118558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.118581 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.133162 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.149351 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.166932 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.197556 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba869555
92a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.217461 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:48Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.221569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.221646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.221664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.221715 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.221736 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.325316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.325383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.325395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.325416 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.325428 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.428136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.428209 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.428222 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.428261 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.428274 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.530996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.531075 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.531089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.531112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.531130 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.633849 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.633883 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.633893 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.633908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.633918 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.736622 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.736953 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.737046 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.737147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.737214 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.820152 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 08:11:46.791541171 +0000 UTC Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.840462 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.840514 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.840528 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.840546 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.840557 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.883832 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.883878 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.883915 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.883872 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:48 crc kubenswrapper[4675]: E0216 19:42:48.884006 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:48 crc kubenswrapper[4675]: E0216 19:42:48.884176 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:48 crc kubenswrapper[4675]: E0216 19:42:48.884317 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:48 crc kubenswrapper[4675]: E0216 19:42:48.884445 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.943255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.943318 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.943342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.943372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:48 crc kubenswrapper[4675]: I0216 19:42:48.943387 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:48Z","lastTransitionTime":"2026-02-16T19:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.046308 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.046371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.046387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.046410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.046429 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.149600 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.149652 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.149665 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.149704 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.149720 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.253057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.253139 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.253245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.253282 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.253306 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.357091 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.357152 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.357168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.357188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.357202 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.461404 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.461438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.461446 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.461462 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.461473 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.564998 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.565425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.565465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.565501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.565525 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.668636 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.668762 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.668788 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.668837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.668863 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.772477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.772538 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.772553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.772573 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.772588 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.821933 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 03:03:17.90257065 +0000 UTC Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.875632 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.875997 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.876093 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.876201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.876394 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.980147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.980192 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.980203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.980222 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:49 crc kubenswrapper[4675]: I0216 19:42:49.980233 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:49Z","lastTransitionTime":"2026-02-16T19:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.083261 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.083301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.083310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.083326 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.083335 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.186940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.187216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.187278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.187360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.187431 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.290757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.291149 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.291338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.291550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.291852 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.394811 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.395273 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.395507 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.395760 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.396005 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.500063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.500136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.500155 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.500183 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.500202 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.517858 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.517949 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.518078 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.518247 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:43:22.518212344 +0000 UTC m=+85.643501930 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.518107 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.518341 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:43:22.518310427 +0000 UTC m=+85.643599993 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.605014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.605079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.605096 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.605136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.605171 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.618997 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.619191 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619265 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:43:22.619223891 +0000 UTC m=+85.744513587 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619405 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619437 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619460 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.619408 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619544 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-02-16 19:43:22.619517289 +0000 UTC m=+85.744806885 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619579 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619717 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619735 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.619807 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:43:22.619772085 +0000 UTC m=+85.745061851 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.708108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.708170 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.708188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.708218 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.708243 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.812072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.812147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.812161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.812181 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.812886 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.822640 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 22:15:25.024856126 +0000 UTC Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.883536 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.883717 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.883820 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.883949 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.883943 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.884133 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.884180 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:50 crc kubenswrapper[4675]: E0216 19:42:50.884246 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.916090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.916148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.916166 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.916191 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:50 crc kubenswrapper[4675]: I0216 19:42:50.916206 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:50Z","lastTransitionTime":"2026-02-16T19:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.019044 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.019090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.019106 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.019128 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.019146 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.024023 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:51 crc kubenswrapper[4675]: E0216 19:42:51.024171 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:51 crc kubenswrapper[4675]: E0216 19:42:51.024239 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:43:07.024221582 +0000 UTC m=+70.149511148 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.123397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.123508 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.123532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.123559 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.123578 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.227334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.227380 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.227390 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.227412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.227425 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.330571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.330647 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.330667 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.330738 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.330764 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.434915 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.434969 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.434981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.434999 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.435010 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.539764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.539874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.539890 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.539918 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.539955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.643862 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.643988 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.644159 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.644295 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.644317 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.747315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.747361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.747374 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.747392 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.747403 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.823005 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 10:55:58.165100922 +0000 UTC Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.849863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.849935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.849945 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.849963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.849975 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.953163 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.953229 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.953241 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.953264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:51 crc kubenswrapper[4675]: I0216 19:42:51.953276 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:51Z","lastTransitionTime":"2026-02-16T19:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.056171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.056237 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.056256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.056285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.056308 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.159913 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.159981 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.159998 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.160025 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.160042 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.263330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.263415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.263434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.263460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.263479 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.366680 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.366764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.366778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.366799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.366812 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.469783 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.469844 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.469855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.469874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.469887 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.498070 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.513435 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.523061 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d
4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.537515 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.558760 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.572921 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.572997 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.573007 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.573027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.573043 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.579609 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.597296 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.614996 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.633300 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.649277 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.667096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.676264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc 
kubenswrapper[4675]: I0216 19:42:52.676333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.676351 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.676379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.676399 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.691123 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.705600 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.720194 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.737163 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.751555 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc 
kubenswrapper[4675]: I0216 19:42:52.767774 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.780107 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.780194 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.780217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.780249 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 
19:42:52.780270 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.782352 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.796585 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce7
35cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:52Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.823992 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:22:03.983816097 +0000 UTC Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883193 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883272 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:52 crc kubenswrapper[4675]: E0216 19:42:52.883362 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883315 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: E0216 19:42:52.883515 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883529 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: E0216 19:42:52.883561 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883590 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883283 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.883636 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:52 crc kubenswrapper[4675]: E0216 19:42:52.883829 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.986854 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.986914 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.986924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.986943 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:52 crc kubenswrapper[4675]: I0216 19:42:52.986955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:52Z","lastTransitionTime":"2026-02-16T19:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.090466 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.090538 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.090559 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.090586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.090606 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.194467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.194549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.194567 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.194595 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.194617 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.296814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.296886 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.296909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.296940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.296964 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.400638 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.400726 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.400740 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.400759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.400772 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.504303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.504389 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.504403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.504428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.504443 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.608019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.608093 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.608117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.608150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.608176 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.711964 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.712043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.712062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.712090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.712118 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.815438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.815522 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.815540 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.815571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.815591 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.825058 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:52:46.542921647 +0000 UTC Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.884946 4675 scope.go:117] "RemoveContainer" containerID="2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.919724 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.919782 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.919807 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.919842 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:53 crc kubenswrapper[4675]: I0216 19:42:53.919870 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:53Z","lastTransitionTime":"2026-02-16T19:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.022032 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.022586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.022602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.022633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.022655 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.125423 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.125471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.125484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.125505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.125517 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.228583 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.228652 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.228673 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.228725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.228745 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.285153 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/1.log" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.289996 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.290738 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.310947 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.327386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.332801 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.332859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.332871 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.332892 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.332907 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.344150 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.377923 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.399482 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.425880 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":
\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube
-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.435558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.435630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.435649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.435679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.435726 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.444417 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.466581 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.484747 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.507198 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.530445 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.538558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.538618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.538637 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.538660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.538678 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.547302 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.574357 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.605402 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db8
0bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:
42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.623455 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.643342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.643409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.643425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.643450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.643471 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.651724 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.680159 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.701218 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:54Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:54 crc 
kubenswrapper[4675]: I0216 19:42:54.749667 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.749724 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.749735 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.749752 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.749763 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.825761 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:26:58.259683899 +0000 UTC Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.852996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.853060 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.853072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.853089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.853099 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.883827 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.883873 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.883930 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:54 crc kubenswrapper[4675]: E0216 19:42:54.883997 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:54 crc kubenswrapper[4675]: E0216 19:42:54.884104 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:54 crc kubenswrapper[4675]: E0216 19:42:54.884175 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.884245 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:54 crc kubenswrapper[4675]: E0216 19:42:54.884367 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.956259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.956320 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.956333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.956353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:54 crc kubenswrapper[4675]: I0216 19:42:54.956367 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:54Z","lastTransitionTime":"2026-02-16T19:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.060181 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.060229 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.060243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.060262 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.060278 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.163050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.163127 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.163148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.163179 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.163205 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.269123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.269193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.269212 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.269244 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.269262 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.372153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.372193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.372204 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.372224 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.372234 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.475267 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.475317 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.475339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.475358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.475371 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.577990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.578065 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.578086 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.578116 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.578138 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.681575 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.681664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.681731 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.681765 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.681793 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.785561 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.785653 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.785674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.785738 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.785760 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.826395 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 03:43:15.7620269 +0000 UTC Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.897604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.897674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.897722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.897747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:55 crc kubenswrapper[4675]: I0216 19:42:55.897769 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:55Z","lastTransitionTime":"2026-02-16T19:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.000577 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.000649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.000669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.000725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.000745 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.071824 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.071877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.071897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.071917 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.071933 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.093641 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.100014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.100062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.100080 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.100100 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.100118 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.122941 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.128771 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.128889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.128954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.128999 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.129070 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.151777 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.157549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.157606 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.157623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.157649 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.157666 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.179350 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.184877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.184925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.184942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.184968 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.184986 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.201155 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.201371 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.203594 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.203646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.203658 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.203677 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.203894 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.300338 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/2.log" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.301763 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/1.log" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.309197 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.309278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.309303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.309335 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.309357 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.311170 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b" exitCode=1 Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.311228 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.311294 4675 scope.go:117] "RemoveContainer" containerID="2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.313354 4675 scope.go:117] "RemoveContainer" containerID="5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.313785 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.336001 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.358598 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.376854 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.409155 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba869555
92a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.412861 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.412992 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.413099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.413183 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.413373 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.427874 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.444987 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.501984 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 
for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.516493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.516547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.516566 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.516594 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.516614 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.532400 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.564538 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.583553 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.600906 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.619108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.619234 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.619337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.619407 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.619463 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.620560 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.638742 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.662742 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.678152 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.702010 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722427 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722638 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.722931 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.741551 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:56Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.827047 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:39:49.927324773 +0000 UTC Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.827295 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.828764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.828961 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.829330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.829664 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.883923 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.884076 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.884078 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.884265 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.884412 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.884438 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.884584 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:56 crc kubenswrapper[4675]: E0216 19:42:56.884815 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.933617 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.934045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.934196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.934308 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:56 crc kubenswrapper[4675]: I0216 19:42:56.934371 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:56Z","lastTransitionTime":"2026-02-16T19:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.038196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.038547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.038643 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.038767 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.039088 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.142166 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.142614 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.142786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.142985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.143145 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.246276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.246340 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.246357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.246383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.246401 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.318684 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/2.log" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.349663 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.349815 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.349837 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.349867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.349889 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.453386 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.453457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.453479 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.453506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.453524 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.557076 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.557155 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.557175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.557205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.557224 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.661005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.661067 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.661084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.661109 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.661124 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.764290 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.764385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.764405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.764434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.764453 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.828923 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 08:45:43.657703847 +0000 UTC Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.867698 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.867775 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.867787 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.867806 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.867820 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.901934 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:57Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.921604 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:57Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.941590 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:57Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.965379 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:57Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.971188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.971235 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.971245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.971263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.971274 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:57Z","lastTransitionTime":"2026-02-16T19:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:57 crc kubenswrapper[4675]: I0216 19:42:57.985078 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:57Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.004438 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.026738 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.064732 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 
for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.074889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.074947 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.074969 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.075000 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.075021 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.090518 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.108589 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.127167 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.147587 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.163215 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc 
kubenswrapper[4675]: I0216 19:42:58.178093 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.178172 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.178195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.178223 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.178242 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.187371 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.206141 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.220126 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.251741 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba869555
92a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.269259 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:42:58Z is after 2025-08-24T17:21:41Z" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.281264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.281477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.281570 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.281673 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.281802 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.384731 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.384771 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.384781 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.384799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.384809 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.489158 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.489225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.489242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.489270 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.489287 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.592410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.592457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.592469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.592488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.592502 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.696363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.696425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.696442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.696469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.696489 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.799834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.799902 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.799923 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.799949 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.799968 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.830611 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:04:21.210916618 +0000 UTC Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.883990 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.883995 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.884161 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:42:58 crc kubenswrapper[4675]: E0216 19:42:58.884299 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.884342 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:42:58 crc kubenswrapper[4675]: E0216 19:42:58.884560 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:42:58 crc kubenswrapper[4675]: E0216 19:42:58.884561 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:42:58 crc kubenswrapper[4675]: E0216 19:42:58.884618 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.902666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.902717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.902732 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.902748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:58 crc kubenswrapper[4675]: I0216 19:42:58.902760 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:58Z","lastTransitionTime":"2026-02-16T19:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.006569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.007152 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.007330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.007501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.007739 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.111309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.111368 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.111381 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.111400 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.111415 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.214511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.214587 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.214605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.214636 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.214661 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.317729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.318233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.318405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.318577 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.318778 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.421858 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.421921 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.421944 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.421974 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.421998 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.525367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.525823 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.525972 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.526156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.526302 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.629966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.630018 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.630171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.630194 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.630207 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.733660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.733716 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.733729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.733748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.733759 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.832515 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 17:16:54.080278686 +0000 UTC Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.837283 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.837330 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.837339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.837360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.837370 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.940148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.940222 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.940232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.940251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:42:59 crc kubenswrapper[4675]: I0216 19:42:59.940261 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:42:59Z","lastTransitionTime":"2026-02-16T19:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.043818 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.043874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.043886 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.043908 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.043925 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.146314 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.146387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.146403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.146456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.146473 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.250022 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.250088 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.250136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.250165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.250182 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.353411 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.353457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.353466 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.353481 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.353493 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.456511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.456584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.456601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.456627 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.456646 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.560205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.560283 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.560302 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.560328 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.560343 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.665678 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.665791 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.665804 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.665822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.665858 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.769157 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.769215 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.769229 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.769251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.769268 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.833093 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:25:10.297299529 +0000 UTC Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.872306 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.872364 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.872379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.872399 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.872785 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.883603 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.883761 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.883782 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:00 crc kubenswrapper[4675]: E0216 19:43:00.883888 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.884017 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:00 crc kubenswrapper[4675]: E0216 19:43:00.884024 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:00 crc kubenswrapper[4675]: E0216 19:43:00.884154 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:00 crc kubenswrapper[4675]: E0216 19:43:00.884427 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.977885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.978457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.978472 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.978498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:00 crc kubenswrapper[4675]: I0216 19:43:00.978515 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:00Z","lastTransitionTime":"2026-02-16T19:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.082139 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.082232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.082263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.082296 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.082326 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.186143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.186205 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.186221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.186252 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.186271 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.290215 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.290267 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.290277 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.290295 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.290306 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.393131 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.393200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.393219 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.393247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.393265 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.496859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.496933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.496952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.496982 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.497005 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.600464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.600558 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.600577 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.600608 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.600628 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.703464 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.703515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.703527 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.703546 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.703560 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.806422 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.806467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.806477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.806496 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.806508 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.834105 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 23:25:21.531287607 +0000 UTC Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.910324 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.910370 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.910387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.910410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:01 crc kubenswrapper[4675]: I0216 19:43:01.910428 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:01Z","lastTransitionTime":"2026-02-16T19:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.013961 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.014009 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.014020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.014040 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.014053 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.116434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.116477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.116488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.116503 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.116515 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.219800 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.219872 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.219890 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.219925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.219949 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.323173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.323257 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.323272 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.323292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.323305 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.427420 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.427488 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.427505 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.427530 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.427547 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.530909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.530955 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.530965 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.530983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.530994 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.633635 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.633677 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.633703 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.633719 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.633730 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.736737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.736798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.736811 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.736831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.736845 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.834582 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 03:47:53.359771092 +0000 UTC Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.839779 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.839882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.839905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.839926 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.839939 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.883515 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.883561 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.883602 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.883775 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:02 crc kubenswrapper[4675]: E0216 19:43:02.883772 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:02 crc kubenswrapper[4675]: E0216 19:43:02.883875 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:02 crc kubenswrapper[4675]: E0216 19:43:02.883994 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:02 crc kubenswrapper[4675]: E0216 19:43:02.884075 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.941978 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.942075 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.942099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.942135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:02 crc kubenswrapper[4675]: I0216 19:43:02.942159 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:02Z","lastTransitionTime":"2026-02-16T19:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.045026 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.045094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.045113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.045138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.045155 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.148904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.149232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.149327 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.149361 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.149390 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.253151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.253193 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.253206 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.253226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.253240 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.356394 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.356442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.356451 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.356469 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.356480 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.458711 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.458748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.458756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.458771 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.458782 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.561422 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.561474 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.561491 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.561511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.561523 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.663850 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.663909 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.663924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.663951 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.663965 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.767133 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.767187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.767200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.767223 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.767238 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.835596 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:45:18.486909044 +0000 UTC Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.870216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.870302 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.870360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.870398 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.870423 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.974092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.974139 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.974151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.974171 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:03 crc kubenswrapper[4675]: I0216 19:43:03.974184 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:03Z","lastTransitionTime":"2026-02-16T19:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.077063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.077113 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.077123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.077143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.077158 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.180611 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.180668 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.180701 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.180722 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.180734 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.283874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.283922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.283938 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.283961 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.283971 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.387174 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.387216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.387227 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.387244 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.387254 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.491221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.491307 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.491320 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.491341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.491352 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.594012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.594056 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.594066 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.594082 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.594092 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.697403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.697466 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.697481 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.697511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.697526 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.800107 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.800175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.800222 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.800255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.800272 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.836636 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:10:49.452969209 +0000 UTC
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.883881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.883937 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.884025 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.883881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:04 crc kubenswrapper[4675]: E0216 19:43:04.884081 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:04 crc kubenswrapper[4675]: E0216 19:43:04.884232 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:04 crc kubenswrapper[4675]: E0216 19:43:04.884313 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:04 crc kubenswrapper[4675]: E0216 19:43:04.884401 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.903034 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.903092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.903105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.903129 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:04 crc kubenswrapper[4675]: I0216 19:43:04.903143 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:04Z","lastTransitionTime":"2026-02-16T19:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.006115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.006187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.006201 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.006223 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.006237 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.109347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.109406 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.109418 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.109440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.109456 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.212378 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.212430 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.212439 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.212460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.212472 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.315194 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.315247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.315266 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.315288 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.315303 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.418041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.418104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.418124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.418147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.418161 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.520733 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.520799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.520809 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.520827 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.520838 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.623726 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.623765 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.623777 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.623795 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.623808 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.727002 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.727057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.727070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.727089 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.727104 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.830481 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.830527 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.830535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.830554 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.830565 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.837668 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:26:10.562561908 +0000 UTC
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.933240 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.933286 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.933299 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.933318 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:05 crc kubenswrapper[4675]: I0216 19:43:05.933331 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:05Z","lastTransitionTime":"2026-02-16T19:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.036619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.036671 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.036706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.036728 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.036739 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.140041 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.140101 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.140120 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.140148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.140168 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.243115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.243160 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.243170 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.243188 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.243202 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.270019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.270081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.270093 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.270112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.270122 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.288806 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:06Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.294298 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.294354 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.294369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.294391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.294406 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.312080 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:06Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.320554 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.320675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.320721 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.320828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.320915 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.339627 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:06Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.346185 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.346456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.346602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.346769 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.346934 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.362822 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:06Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.368263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.368310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.368337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.368359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.368372 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.381193 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:06Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.381315 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.383180 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.383231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.383253 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.383300 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.383317 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.486440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.486502 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.486520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.486550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.486569 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.589623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.589669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.589680 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.589714 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.589725 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.692528 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.692569 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.692579 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.692599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.692611 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.795480 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.795520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.795530 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.795546 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.795557 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.837999 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 19:12:44.204061971 +0000 UTC Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.883673 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.883754 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.883852 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.883784 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.883990 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.884077 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.884173 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:06 crc kubenswrapper[4675]: E0216 19:43:06.884276 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.898187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.898216 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.898225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.898241 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:06 crc kubenswrapper[4675]: I0216 19:43:06.898250 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:06Z","lastTransitionTime":"2026-02-16T19:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.001522 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.001555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.001566 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.001584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.001596 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.105012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.105077 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.105092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.105117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.105134 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.110627 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:07 crc kubenswrapper[4675]: E0216 19:43:07.110832 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:43:07 crc kubenswrapper[4675]: E0216 19:43:07.110928 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:43:39.110904793 +0000 UTC m=+102.236194359 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.208275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.208350 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.208369 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.208398 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.208415 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.311262 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.311315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.311329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.311352 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.311366 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.415553 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.415603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.415619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.415640 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.415656 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.519094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.519140 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.519152 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.519172 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.519184 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.622154 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.622203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.622217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.622237 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.622250 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.724809 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.724855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.724866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.724882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.724893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.827988 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.828037 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.828049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.828065 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.828079 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.838344 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 18:49:29.78175977 +0000 UTC Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.885167 4675 scope.go:117] "RemoveContainer" containerID="5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b" Feb 16 19:43:07 crc kubenswrapper[4675]: E0216 19:43:07.885482 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.906233 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.921561 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.931351 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.931396 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.931408 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.931428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.931442 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:07Z","lastTransitionTime":"2026-02-16T19:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.932901 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.952428 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.966151 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.985542 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b63e56e590054687fedf2ed3e34d18b7fe5b675ab245cc644b18461752082d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:34Z\\\",\\\"message\\\":\\\"per:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006f7818f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-config-operator,},ClusterIP:10.217.4.161,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.161],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0216 19:42:34.165060 6088 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0216 19:42:34.165067 6088 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 
for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:07 crc kubenswrapper[4675]: I0216 19:43:07.997753 4675 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:07Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.011449 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.024547 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.034147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.034209 4675 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.034221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.034245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.034257 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.040509 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.057195 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.073307 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.088225 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.105535 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db8
0bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:
42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.117806 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.131506 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.136494 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.136551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.136562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.136579 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.136589 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.150733 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.166903 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc 
kubenswrapper[4675]: I0216 19:43:08.184329 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.202081 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.217096 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.237340 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba869555
92a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.239347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.239405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.239417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.239460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.239475 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.252458 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.285218 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.304121 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.320425 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.336978 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.345187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.345252 4675 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.345269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.345294 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.345313 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.354641 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.373091 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.390389 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.404751 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.421574 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db8
0bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:
42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.433701 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.447037 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.448272 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.448304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.448312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.448331 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.448343 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.460116 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.473107 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:08Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:08 crc 
kubenswrapper[4675]: I0216 19:43:08.551196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.551248 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.551261 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.551284 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.551297 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.654705 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.654736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.654745 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.654763 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.654773 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.757717 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.757762 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.757774 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.757798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.757813 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.839541 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:59:32.177229978 +0000 UTC Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.861098 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.861158 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.861217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.861240 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.861253 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.883859 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.883885 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.883879 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.883906 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:08 crc kubenswrapper[4675]: E0216 19:43:08.884031 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:08 crc kubenswrapper[4675]: E0216 19:43:08.884583 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:08 crc kubenswrapper[4675]: E0216 19:43:08.884683 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:08 crc kubenswrapper[4675]: E0216 19:43:08.884681 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.963869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.963912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.963924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.963944 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:08 crc kubenswrapper[4675]: I0216 19:43:08.963957 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:08Z","lastTransitionTime":"2026-02-16T19:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.067363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.067414 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.067426 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.067446 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.067464 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.170601 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.170668 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.170679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.170756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.170777 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.273390 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.273426 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.273435 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.273450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.273462 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.378105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.378175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.378187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.378210 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.378225 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.480726 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.480780 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.480797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.480821 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.480836 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.584047 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.584092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.584105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.584125 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.584138 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.687552 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.687603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.687615 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.687636 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.687647 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.791048 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.791085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.791100 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.791119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.791132 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.839973 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 23:21:23.327216099 +0000 UTC Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.893245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.893290 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.893300 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.893315 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.893328 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.997077 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.997124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.997136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.997155 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:09 crc kubenswrapper[4675]: I0216 19:43:09.997169 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:09Z","lastTransitionTime":"2026-02-16T19:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.099784 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.099831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.099841 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.099865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.099879 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.203087 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.203148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.203161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.203184 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.203197 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.307516 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.307586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.307605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.307633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.307653 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.411078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.411138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.411161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.411192 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.411217 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.514170 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.514231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.514243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.514263 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.514275 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.617278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.617344 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.617357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.617379 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.617392 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.720706 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.720777 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.720790 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.720814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.720827 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.823838 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.823891 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.823904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.823923 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.823939 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.840308 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 10:05:20.52614967 +0000 UTC
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.883868 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.883905 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.883961 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.883886 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:10 crc kubenswrapper[4675]: E0216 19:43:10.884030 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:10 crc kubenswrapper[4675]: E0216 19:43:10.884164 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:10 crc kubenswrapper[4675]: E0216 19:43:10.884252 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:10 crc kubenswrapper[4675]: E0216 19:43:10.884352 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.927836 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.927911 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.927931 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.927959 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:10 crc kubenswrapper[4675]: I0216 19:43:10.927979 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:10Z","lastTransitionTime":"2026-02-16T19:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.031796 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.031866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.031881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.031904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.031918 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.134340 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.134389 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.134403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.134425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.134442 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.237539 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.237592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.237602 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.237623 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.237634 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.340954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.341014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.341027 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.341046 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.341059 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.444313 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.444409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.444427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.444463 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.444483 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.547974 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.548038 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.548049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.548078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.548091 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.651450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.652461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.652533 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.652560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.652578 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.755365 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.755450 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.755460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.755482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.755494 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.840941 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:42:33.00261703 +0000 UTC
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.858714 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.858759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.858769 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.858788 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.858800 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.961859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.961905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.961915 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.961929 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:11 crc kubenswrapper[4675]: I0216 19:43:11.961940 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:11Z","lastTransitionTime":"2026-02-16T19:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.065778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.065874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.065887 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.065928 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.065941 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.169650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.169805 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.169830 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.169865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.169888 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.272789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.272845 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.272865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.272893 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.272915 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.376584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.376638 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.376657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.376681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.376728 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.389171 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/0.log"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.389253 4675 generic.go:334] "Generic (PLEG): container finished" podID="c9a99563-d631-455f-8464-160e5619c610" containerID="798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9" exitCode=1
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.389303 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerDied","Data":"798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9"}
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.390019 4675 scope.go:117] "RemoveContainer" containerID="798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9"
Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.414814 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.434551 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.449560 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc 
kubenswrapper[4675]: I0216 19:43:12.473267 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.484372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.484482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.484497 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.484548 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 
19:43:12.484566 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.491508 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.514671 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce7
35cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.548016 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.567095 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.586061 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.588103 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.588199 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.588220 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.588865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.588921 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.607120 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.631474 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.646438 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.662953 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.679910 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.695962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.696059 4675 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.696112 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.696141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.696159 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.695959 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.711263 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.728209 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.744055 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:12Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.798578 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.799092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.799111 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.799134 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.799150 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.841498 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:04:56.215599265 +0000 UTC Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.883869 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:12 crc kubenswrapper[4675]: E0216 19:43:12.884075 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.884180 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:12 crc kubenswrapper[4675]: E0216 19:43:12.884269 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.884364 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:12 crc kubenswrapper[4675]: E0216 19:43:12.884496 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.884586 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:12 crc kubenswrapper[4675]: E0216 19:43:12.884758 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.902289 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.902341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.902359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.902383 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:12 crc kubenswrapper[4675]: I0216 19:43:12.902403 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:12Z","lastTransitionTime":"2026-02-16T19:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.005156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.005232 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.005251 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.005280 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.005299 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.109562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.109630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.109653 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.109681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.109735 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.213869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.213954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.213977 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.214007 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.214025 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.318192 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.318245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.318264 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.318292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.318314 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.399349 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/0.log" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.399493 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerStarted","Data":"4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.421571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.421621 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.421633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.421653 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.421664 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.426230 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.447981 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc 
kubenswrapper[4675]: I0216 19:43:13.473169 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.501368 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.523201 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.525764 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.526013 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.526231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.526429 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.526997 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.543168 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.565361 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.592027 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.611308 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.630755 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.630818 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 
19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.630840 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.630872 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.630893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.634520 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.653444 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.672614 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.692847 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.719240 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.733814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.733895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.733907 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.733929 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.733942 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.737548 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.797918 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.816417 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.836873 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.836925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.836942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 
19:43:13.836966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.836984 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.842115 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:16:54.247031992 +0000 UTC Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.842641 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:13Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.939350 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.939416 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.939428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.939447 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:13 crc kubenswrapper[4675]: I0216 19:43:13.939460 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:13Z","lastTransitionTime":"2026-02-16T19:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.043321 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.043397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.043415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.043444 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.043466 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.147375 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.147439 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.147457 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.147480 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.147501 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.251371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.251461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.251501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.251550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.251580 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.355604 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.355647 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.355657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.355674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.355699 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.458646 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.458727 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.458747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.458770 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.458788 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.562342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.562382 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.562394 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.562413 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.562424 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.666165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.666309 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.666332 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.666404 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.666428 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.770126 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.770196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.770230 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.770257 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.770275 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.842926 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 15:43:33.502019579 +0000 UTC Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.873500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.873675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.873766 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.873876 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.873962 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.883933 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:14 crc kubenswrapper[4675]: E0216 19:43:14.884150 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.883970 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:14 crc kubenswrapper[4675]: E0216 19:43:14.884411 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.883927 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:14 crc kubenswrapper[4675]: E0216 19:43:14.884625 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.883996 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:14 crc kubenswrapper[4675]: E0216 19:43:14.884871 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.977411 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.977483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.977495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.977515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:14 crc kubenswrapper[4675]: I0216 19:43:14.977527 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:14Z","lastTransitionTime":"2026-02-16T19:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.081589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.081725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.081755 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.081790 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.081809 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.184436 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.184494 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.184504 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.184523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.184534 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.287748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.287788 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.287797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.287812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.287825 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.390897 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.390939 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.390949 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.390968 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.390982 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.494030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.494126 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.494148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.494176 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.494196 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.603255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.603338 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.603358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.603390 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.603410 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.706225 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.706285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.706298 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.706316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.706330 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.809786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.809856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.809874 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.809901 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.809921 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.844310 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:29:45.169261296 +0000 UTC Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.912630 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.912677 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.912704 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.912720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:15 crc kubenswrapper[4675]: I0216 19:43:15.912732 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:15Z","lastTransitionTime":"2026-02-16T19:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.015511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.015571 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.015590 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.015617 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.015635 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.119065 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.119119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.119131 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.119151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.119164 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.221991 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.222053 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.222070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.222098 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.222116 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.324744 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.324789 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.324800 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.324819 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.324829 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.426885 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.426988 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.427012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.427043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.427064 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.529948 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.530024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.530045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.530073 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.530092 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.607811 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.607884 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.607905 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.607933 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.607951 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.624878 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:16Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.629478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.629548 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.629573 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.629607 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.629630 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.647074 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:16Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.651425 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.651476 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.651498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.651526 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.651547 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.664921 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:16Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.669890 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.669942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.669952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.669970 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.669984 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.683550 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:16Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.687346 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.687391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.687400 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.687415 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.687425 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.703545 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:16Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.703677 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.705916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.705989 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.706014 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.706043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.706066 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.808769 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.808834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.808857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.808889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.808915 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.844912 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 00:33:39.059740135 +0000 UTC Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.883552 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.883609 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.883581 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.883551 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.883802 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.883981 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.884046 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:16 crc kubenswrapper[4675]: E0216 19:43:16.884110 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.912574 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.912632 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.912650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.912683 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:16 crc kubenswrapper[4675]: I0216 19:43:16.912742 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:16Z","lastTransitionTime":"2026-02-16T19:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.016899 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.016996 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.017020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.017053 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.017078 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.120618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.120666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.120682 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.120735 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.120754 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.224720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.224775 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.224793 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.224816 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.224834 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.328336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.328388 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.328405 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.328428 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.328446 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.430756 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.430818 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.430833 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.430854 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.430866 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.534078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.534166 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.534187 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.534217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.534241 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.637342 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.637419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.637438 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.637465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.637484 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.740535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.740584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.740597 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.740619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.740633 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845310 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 18:32:19.705562127 +0000 UTC Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845752 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.845787 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.899719 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:17Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.920286 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:17Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.944604 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\
\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:17Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.948758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.948787 4675 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.948798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.948815 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.948827 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:17Z","lastTransitionTime":"2026-02-16T19:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.967043 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:17Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:17 crc kubenswrapper[4675]: I0216 19:43:17.992380 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:17Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.009386 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.031003 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.048881 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.052044 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.052104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.052119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.052147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.052166 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.061476 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.076275 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.104577 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.120827 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.138319 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.153669 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.155029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.155084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.155095 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 
19:43:18.155114 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.155127 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.174079 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367f
f65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.188516 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.201243 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc 
kubenswrapper[4675]: I0216 19:43:18.216598 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:18Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.259524 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.259584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.259597 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.259619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.259633 4675 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.363682 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.363757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.363774 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.363799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.363815 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.467000 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.467052 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.467062 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.467080 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.467091 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.570279 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.570343 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.570358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.570378 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.570394 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.681079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.681128 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.681141 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.681162 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.681175 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.784226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.784303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.784322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.784349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.784367 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.845662 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 12:52:36.686026165 +0000 UTC Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.883588 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.883629 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.883724 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:18 crc kubenswrapper[4675]: E0216 19:43:18.883860 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.883912 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:18 crc kubenswrapper[4675]: E0216 19:43:18.884254 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:18 crc kubenswrapper[4675]: E0216 19:43:18.884410 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:18 crc kubenswrapper[4675]: E0216 19:43:18.884515 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.887328 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.887387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.887411 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.887440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.887464 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.903040 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.991494 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.991555 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.991572 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.991596 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:18 crc kubenswrapper[4675]: I0216 19:43:18.991614 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:18Z","lastTransitionTime":"2026-02-16T19:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.094670 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.094728 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.094737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.094754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.094765 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.196775 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.196836 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.196856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.196878 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.196892 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.300260 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.300340 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.300362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.300393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.300412 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.403674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.403797 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.403824 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.403859 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.403877 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.508328 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.508403 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.508427 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.508456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.508498 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.611814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.611856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.611865 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.611881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.611893 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.716336 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.716395 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.716419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.716445 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.716462 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.819276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.819333 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.819346 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.819368 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.819384 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.846830 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 22:11:07.478574179 +0000 UTC Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.922875 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.922911 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.922921 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.922934 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:19 crc kubenswrapper[4675]: I0216 19:43:19.922945 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:19Z","lastTransitionTime":"2026-02-16T19:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.026542 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.026611 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.026629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.026654 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.026673 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.130310 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.130371 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.130385 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.130406 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.130423 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.233224 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.233300 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.233319 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.233347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.233380 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.336151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.336221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.336240 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.336267 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.336284 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.440008 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.440096 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.440123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.440190 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.440216 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.543593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.543670 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.543749 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.543779 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.543798 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.646618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.646675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.646712 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.646730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.646742 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.750584 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.750675 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.750759 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.750785 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.750821 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.848033 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 21:58:39.151294352 +0000 UTC Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.854123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.854332 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.854419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.854512 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.854583 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.883746 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.883808 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.883769 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.883773 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:20 crc kubenswrapper[4675]: E0216 19:43:20.883908 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:20 crc kubenswrapper[4675]: E0216 19:43:20.884028 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:20 crc kubenswrapper[4675]: E0216 19:43:20.884174 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:20 crc kubenswrapper[4675]: E0216 19:43:20.884271 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.957158 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.957211 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.957228 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.957254 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:20 crc kubenswrapper[4675]: I0216 19:43:20.957274 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:20Z","lastTransitionTime":"2026-02-16T19:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.060786 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.060881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.060906 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.060943 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.060966 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.164349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.165923 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.165956 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.165982 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.166003 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.269353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.269413 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.269432 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.269456 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.269470 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.372268 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.372343 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.372360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.372387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.372409 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.475253 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.475359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.475381 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.475417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.475442 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.578720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.578795 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.578870 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.578903 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.578927 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.681634 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.681916 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.681932 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.681952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.681968 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.785753 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.785822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.785843 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.785870 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.785890 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.848560 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 14:59:31.689337856 +0000 UTC Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.885521 4675 scope.go:117] "RemoveContainer" containerID="5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.889030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.889084 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.889104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.889132 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.889154 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.992818 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.992889 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.992912 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.992939 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:21 crc kubenswrapper[4675]: I0216 19:43:21.992958 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:21Z","lastTransitionTime":"2026-02-16T19:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.096337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.096372 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.096384 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.096401 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.096412 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.199412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.199477 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.199500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.199531 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.199556 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.302982 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.303048 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.303068 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.303092 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.303117 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.405887 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.405951 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.405964 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.405983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.405994 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.440509 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/2.log" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.443038 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.443593 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.459765 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.479890 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.501752 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe6
7c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"
podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.508033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.508107 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.508128 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.508157 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.508177 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.543249 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.558632 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dff80f0-fe7f-4d8c-9c10-e2775f4d5ae8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d44d6cbe740841967325282ecae8c05f9bf0fc4620d33b5a3da13cee2f1450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.577071 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.595143 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.600977 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.601039 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:22 crc kubenswrapper[4675]: 
E0216 19:43:22.601129 4675 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.601158 4675 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.601194 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.601178058 +0000 UTC m=+149.726467614 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.601211 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.601203589 +0000 UTC m=+149.726493145 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.609558 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a
33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":
\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.610902 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.610940 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.610952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.610967 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.610981 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.631055 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 
19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.649196 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.667085 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.693729 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.701431 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.701536 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.701622 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.70159721 +0000 UTC m=+149.826886776 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.701721 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.701752 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.701782 4675 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.701810 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.701830 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-02-16 19:44:26.701818046 +0000 UTC m=+149.827107692 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.702075 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.702123 4675 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.702138 4675 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.702222 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.702199285 +0000 UTC m=+149.827488841 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.710558 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.713512 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.713565 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.713579 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.713599 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.713611 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.726658 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.742286 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.754484 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.773195 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.788794 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.800868 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:22Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:22 crc 
kubenswrapper[4675]: I0216 19:43:22.815891 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.815934 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.815946 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.815986 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.815998 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.849402 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:09:51.125748049 +0000 UTC Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.884118 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.884188 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.884210 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.884298 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.884412 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.884622 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.884713 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:22 crc kubenswrapper[4675]: E0216 19:43:22.884925 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.926331 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.926410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.926430 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.926460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:22 crc kubenswrapper[4675]: I0216 19:43:22.926480 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:22Z","lastTransitionTime":"2026-02-16T19:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.029807 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.029853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.029866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.029886 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.029899 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.133061 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.133117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.133132 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.133151 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.133161 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.235807 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.235843 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.235853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.235868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.235878 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.339167 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.339231 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.339243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.339260 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.339272 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.447367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.447484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.447517 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.447560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.447590 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.451415 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/3.log" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.452241 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/2.log" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.455391 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" exitCode=1 Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.455442 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.455497 4675 scope.go:117] "RemoveContainer" containerID="5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.456880 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:43:23 crc kubenswrapper[4675]: E0216 19:43:23.457256 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.474924 4675 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dff80f0-fe7f-4d8c-9c10-e2775f4d5ae8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d44d6cbe740841967325282ecae8c05f9bf0fc4620d33b5a3da13cee2f1450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.493825 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.526376 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.542027 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.550778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.550861 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 
19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.550881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.550948 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.550971 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.563452 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.582864 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.605931 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.632411 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.654781 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.654853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.654868 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.654890 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.654905 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.656443 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452aa74eda5605a3b5ad24d32c48b20ec9f2c479d5dd388e4307b79245e8f4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:42:55Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0216 19:42:55.399413 6340 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 19:42:55.399442 6340 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 19:42:55.399468 6340 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0216 19:42:55.399515 6340 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 19:42:55.399545 6340 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 19:42:55.399552 6340 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 19:42:55.399802 6340 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 19:42:55.399864 6340 factory.go:656] Stopping watch factory\\\\nI0216 19:42:55.399912 6340 ovnkube.go:599] Stopped ovnkube\\\\nI0216 19:42:55.399629 6340 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 19:42:55.399637 6340 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 19:42:55.399654 6340 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 19:42:55.399993 6340 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 19:42:55.400004 6340 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 19:42:55.400012 6340 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 19:42:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:22Z\\\",\\\"message\\\":\\\"d_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0216 19:43:22.978417 6730 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0216 19:43:22.978457 6730 tra\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"ho
st-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.672242 4675 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.693741 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.709610 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.730462 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da
14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.744991 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.758084 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc 
kubenswrapper[4675]: I0216 19:43:23.758285 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.758341 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.758355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.758375 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.758388 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.778548 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515b
e84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.796393 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.809488 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.825888 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:23Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.850491 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 03:20:24.182999849 +0000 UTC Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.861754 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc 
kubenswrapper[4675]: I0216 19:43:23.861810 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.861828 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.861856 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.861879 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.965470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.965535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.965552 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.965578 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:23 crc kubenswrapper[4675]: I0216 19:43:23.965599 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:23Z","lastTransitionTime":"2026-02-16T19:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.068532 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.068615 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.068644 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.068677 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.068737 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.171962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.172024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.172043 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.172070 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.172089 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.275088 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.275144 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.275156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.275176 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.275190 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.378478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.378523 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.378533 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.378549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.378559 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.460905 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/3.log" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.465986 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:43:24 crc kubenswrapper[4675]: E0216 19:43:24.466181 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.482259 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.482337 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.482362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.482391 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.482413 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.514855 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358c3bab-4666-44d1-90fb-3affa22327e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e0dcd3707a91b9a3869942408c144f0b4ca22cd64e99de5fe43f95094e44a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3a4d507eb459edf9f2a3c58417ae320eba85a7b597808fdb0eabe7d7d5afbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd404f7efe7c09d47f0550b7cf66ac7193f315f60db74cd5609593546037c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bfda6b7519474256ab63750ba86955592a2e47a0e364e4170a19fe28270459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e9a715621e072260b3d9703159db5eaad4918ff7a2cb7689dcb914ccd617950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8bfd47afcf3bbd1831b98e69a3ee5fe7b49dd04d4f2efc9d9dc8fda831c74a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65752e9eeab5fea26d84c7d43a33947d3dbabdd31578480904db24d60d298a0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd389e534c9b05c81db5cfb0a9f1f5e78e2f1ccbcce15953172836fb3fb6744f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:01Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.541091 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dff80f0-fe7f-4d8c-9c10-e2775f4d5ae8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d44d6cbe740841967325282ecae8c05f9bf0fc4620d33b5a3da13cee2f1450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17d4432915444ec16f61697928fa82b49a78aac329f5777d2db8681ae8e2f709\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.561464 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.582761 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.585185 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.585250 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.585269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.585294 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.585312 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.598898 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.613424 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj5xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9a99563-d631-455f-8464-160e5619c610\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:11Z\\\",\\\"message\\\":\\\"2026-02-16T19:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976\\\\n2026-02-16T19:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9b0c1c65-b0b3-415a-8e3c-d91f805b2976 to /host/opt/cni/bin/\\\\n2026-02-16T19:42:26Z [verbose] multus-daemon started\\\\n2026-02-16T19:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T19:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s94dl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj5xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.636440 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T19:43:22Z\\\",\\\"message\\\":\\\"d_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0216 19:43:22.978417 6730 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0216 19:43:22.978457 6730 tra\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:43:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87374d32a3ee86c457
6db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbc5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gpc5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.654504 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10414964-83d0-4d95-a89f-e3212a8015b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3c34dee6cb538e5de05d8703350e3013401589489756aafbf8fed2ef597d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe
39bdc5ad08ad2351cd270058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27l4j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j7pnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.671415 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\"
,\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.689063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.689122 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.689142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.689168 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.689188 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.690126 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aa
caf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.710909 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.732130 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f385df3d-0543-4189-88a6-2163c8b9b959\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f034148fa1b45c35dd6c8b9b70c08b1d745d1377335d7f6d2b35793c29ce9e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://993eb6f1a848a5279ad2e5d47f47c873918dbca13d2ff7db80bebbad6f948476\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bb51b679cec1e9962a9d52b4fa890f31df7d9c00ed9f4885e94ac52455341c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0161d628c911b5e33369708f774f6f8313fb74d7fdc1266b7bfa346d5d6200a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://470da14b61b6a4b60e151726b8a40431c5a38f30416cb9e635e843541e8b28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd74b2f78bf4719b66ec38da6783f0e4ee8b93d27d6c27907212e906c1cd8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f742c6f36d6a537e143339eaa367ff65398f680da2e175aeaed1a4da1886148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f77s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rg8bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.744955 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4vvwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de9472b-0e23-4821-86b8-202bfd739aee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2a601bdf26e518854cf3a55f3aed6274701bc40b20fe0f0e7673e65a007f0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqw54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4vvwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.761457 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"776ad1bc-395c-41c3-8c95-37a58a8cd4e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04dbf4934ecd5c1773fe7c1d032d68d59f41556af022bdf1c36f8de85221616a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d3160c906648c15ea910b1b2fa9223c716bbefac62b25bba4f83f3b50217b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0dd7a8d986a0dd78568e33afa36fcd8bba88cf885d29b22484132bf0034c47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T19:41:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.776425 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04c59a1a3ae53156b955a90113bc3d3b4424b7dcc3baa91e8256fd0893751af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.789118 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm4b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbgjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc 
kubenswrapper[4675]: I0216 19:43:24.792495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.792551 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.792565 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.792585 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.792596 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.807247 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e2a1727464113d2cd88dfb8ba52bb1944b3dfe67a0d40f5b2e57fe483e5890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83a4afeccd65f013b07c0f90cbb94f765f918aaaeeca716935f940deb85403d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.821770 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-stxk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af599569-feb0-432a-9adf-1b72d1ac1a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5944f34bb922394197cfb79978d19139f70724115954043828e9f609ca32945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-stxk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.835198 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e98c1ed8-cbed-4d30-b3fc-41beff5d65c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410f4deab4b31698f80da30412dff846ca9f1004dfab5d4cfe67c6892b3a7aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a2a1cc0b69b60d38a38ab898236d8577ce735cd2e1698f6fd3cb3d3c23b62c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df2gr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h9s8p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:24Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.851372 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 08:31:08.610385916 +0000 UTC Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.883338 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:24 crc kubenswrapper[4675]: E0216 19:43:24.883573 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.883769 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:24 crc kubenswrapper[4675]: E0216 19:43:24.883848 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.883910 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:24 crc kubenswrapper[4675]: E0216 19:43:24.883954 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.884086 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:24 crc kubenswrapper[4675]: E0216 19:43:24.884299 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.895654 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.895712 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.895723 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.895742 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.895755 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.999008 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.999090 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.999111 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.999164 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:24 crc kubenswrapper[4675]: I0216 19:43:24.999182 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:24Z","lastTransitionTime":"2026-02-16T19:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.102580 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.102645 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.102666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.102724 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.102743 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.206021 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.206099 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.206119 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.206152 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.206172 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.309888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.310002 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.310028 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.310063 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.310085 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.413280 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.413362 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.413387 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.413426 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.413452 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.518421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.518547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.518566 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.518592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.518636 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.622402 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.622481 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.622500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.622530 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.622551 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.725721 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.725814 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.725833 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.725855 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.725903 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.829631 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.829752 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.829778 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.829831 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.829855 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.852473 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 07:30:02.679575889 +0000 UTC Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.932550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.932625 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.932650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.932680 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:25 crc kubenswrapper[4675]: I0216 19:43:25.932738 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:25Z","lastTransitionTime":"2026-02-16T19:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.036320 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.036357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.036393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.036412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.036429 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.139167 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.139214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.139227 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.139247 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.139261 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.242178 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.242255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.242284 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.242325 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.242366 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.346133 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.346217 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.346242 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.346280 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.346304 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.449278 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.449334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.449348 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.449370 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.449384 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.553440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.553501 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.553514 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.553536 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.553548 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.655888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.655983 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.656001 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.656033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.656061 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.759574 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.759629 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.759639 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.759660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.759671 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.853091 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:20:41.691947842 +0000 UTC Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.863052 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.863107 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.863124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.863148 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.863167 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.883516 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.883599 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.883726 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.883732 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:26 crc kubenswrapper[4675]: E0216 19:43:26.883919 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:26 crc kubenswrapper[4675]: E0216 19:43:26.884183 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:26 crc kubenswrapper[4675]: E0216 19:43:26.884297 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:26 crc kubenswrapper[4675]: E0216 19:43:26.884471 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.968067 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.968129 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.968146 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.968169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:26 crc kubenswrapper[4675]: I0216 19:43:26.968187 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:26Z","lastTransitionTime":"2026-02-16T19:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.007056 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.007124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.007142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.007167 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.007188 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.026845 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.031174 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.031203 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.031213 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.031228 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.031240 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.047885 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.053413 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.053471 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.053489 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.053521 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.053538 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.075470 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.080943 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.081023 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.081048 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.081083 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.081107 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.100223 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.105355 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.105400 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.105419 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.105443 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.105464 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.125833 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T19:43:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc074e08-7429-4b93-941d-3d8335487f37\\\",\\\"systemUUID\\\":\\\"a98cec76-f68c-4af0-9517-b326e9616347\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: E0216 19:43:27.126089 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.128610 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.128674 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.128728 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.128758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.128779 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.232808 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.232869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.232883 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.232904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.232919 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.336147 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.336214 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.336230 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.336257 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.336275 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.439316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.439363 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.439377 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.439393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.439405 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.542470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.542550 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.542562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.542580 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.542591 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.646589 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.646665 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.646720 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.646748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.646773 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.750574 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.750650 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.750664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.750729 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.750748 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.853532 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 07:05:40.931351906 +0000 UTC Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.854924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.854994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.855020 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.855049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.855067 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.907097 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc6bac98-79c6-4970-ba8b-7996a4046f7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T19:42:02Z\\\",\\\"message\\\":\\\"W0216 19:42:01.228572 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0216 19:42:01.228958 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771270921 cert, and key in /tmp/serving-cert-1169484463/serving-signer.crt, /tmp/serving-cert-1169484463/serving-signer.key\\\\nI0216 19:42:01.683881 1 observer_polling.go:159] Starting file observer\\\\nW0216 19:42:01.687387 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0216 19:42:01.693158 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 19:42:01.694799 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1169484463/tls.crt::/tmp/serving-cert-1169484463/tls.key\\\\\\\"\\\\nF0216 19:42:02.048422 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T19:42:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.925345 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3824464b-e8c8-4a8a-a50b-0ba14ad23d54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T19:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eebdfea2b981083cc06feead15d85cc403b59837948716edf1a4649aff1bfc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea7a27c1d937fcd2e1879b3eb16aacaf9733618822a2064c003a0e69828c4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53ef6384f8ec4cb5976952d83be543b5a555d2ebcbe603a87889791803b1ae3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-16T19:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb0650dde5c6086d91ae092a0550467170f105f67c5cab5b0789f3c21d0c6966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T19:41:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T19:41:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T19:41:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.945355 4675 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T19:42:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18c7d7a1f00611bdd6074314992cb6601ce0c043b831b6125010fe098c7e12ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T19:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T19:43:27Z is after 2025-08-24T17:21:41Z" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.958359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.958416 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.958434 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.958461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:27 crc kubenswrapper[4675]: I0216 19:43:27.958483 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:27Z","lastTransitionTime":"2026-02-16T19:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.015175 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pj5xg" podStartSLOduration=69.015080643 podStartE2EDuration="1m9.015080643s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.014240753 +0000 UTC m=+91.139530309" watchObservedRunningTime="2026-02-16 19:43:28.015080643 +0000 UTC m=+91.140370239" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.064102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.064156 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.064169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.064185 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.064199 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.086265 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podStartSLOduration=69.086224358 podStartE2EDuration="1m9.086224358s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.063963422 +0000 UTC m=+91.189252978" watchObservedRunningTime="2026-02-16 19:43:28.086224358 +0000 UTC m=+91.211513954" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.086962 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rg8bp" podStartSLOduration=69.086949566 podStartE2EDuration="1m9.086949566s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.086832573 +0000 UTC m=+91.212122149" watchObservedRunningTime="2026-02-16 19:43:28.086949566 +0000 UTC m=+91.212239192" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.116430 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.116406438 podStartE2EDuration="1m8.116406438s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.116225633 +0000 UTC m=+91.241515259" watchObservedRunningTime="2026-02-16 19:43:28.116406438 +0000 UTC m=+91.241695984" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.116941 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4vvwh" 
podStartSLOduration=69.116935791 podStartE2EDuration="1m9.116935791s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.099823901 +0000 UTC m=+91.225113457" watchObservedRunningTime="2026-02-16 19:43:28.116935791 +0000 UTC m=+91.242225347" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.167349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.167406 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.167421 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.167445 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.167459 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.207325 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h9s8p" podStartSLOduration=68.207294706 podStartE2EDuration="1m8.207294706s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.207197904 +0000 UTC m=+91.332487480" watchObservedRunningTime="2026-02-16 19:43:28.207294706 +0000 UTC m=+91.332584262" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.207803 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-stxk6" podStartSLOduration=69.207793378 podStartE2EDuration="1m9.207793378s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.188839534 +0000 UTC m=+91.314129120" watchObservedRunningTime="2026-02-16 19:43:28.207793378 +0000 UTC m=+91.333082944" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.239355 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.239333542 podStartE2EDuration="1m8.239333542s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.238228815 +0000 UTC m=+91.363518391" watchObservedRunningTime="2026-02-16 19:43:28.239333542 +0000 UTC m=+91.364623098" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.252775 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.252750911 
podStartE2EDuration="10.252750911s" podCreationTimestamp="2026-02-16 19:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:28.252656409 +0000 UTC m=+91.377945985" watchObservedRunningTime="2026-02-16 19:43:28.252750911 +0000 UTC m=+91.378040467" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.270520 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.270576 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.270590 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.270609 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.270621 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.373085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.373143 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.373157 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.373176 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.373189 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.476409 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.476461 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.476470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.476487 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.476498 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.579481 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.579556 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.579576 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.579607 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.579633 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.682643 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.682757 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.682779 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.682813 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.682870 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.786603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.786669 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.786736 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.786765 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.786824 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.854376 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:50:11.330879991 +0000 UTC Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.884326 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.884544 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.884398 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:28 crc kubenswrapper[4675]: E0216 19:43:28.884781 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:28 crc kubenswrapper[4675]: E0216 19:43:28.884727 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.884379 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:28 crc kubenswrapper[4675]: E0216 19:43:28.884940 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:28 crc kubenswrapper[4675]: E0216 19:43:28.885072 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.889619 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.889784 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.889867 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.889954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.890038 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.992657 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.993024 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.993102 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.993180 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:28 crc kubenswrapper[4675]: I0216 19:43:28.993253 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:28Z","lastTransitionTime":"2026-02-16T19:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.096881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.096967 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.096990 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.097025 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.097049 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.200499 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.200560 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.200580 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.200610 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.200630 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.303586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.303963 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.304045 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.304173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.304265 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.408620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.409357 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.409578 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.409822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.410086 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.512850 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.512928 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.512949 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.513017 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.513031 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.616029 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.616101 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.616126 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.616157 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.616181 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.719300 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.719349 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.719358 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.719374 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.719385 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.822426 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.822798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.822904 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.822989 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.823059 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.855059 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:21:38.287630021 +0000 UTC Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.927142 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.927483 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.927603 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.927731 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:29 crc kubenswrapper[4675]: I0216 19:43:29.928028 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:29Z","lastTransitionTime":"2026-02-16T19:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.032078 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.032135 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.032150 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.032173 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.032188 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.135627 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.135738 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.135758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.135781 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.135799 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.238936 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.239012 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.239030 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.239055 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.239074 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.341799 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.341866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.341881 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.341901 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.341915 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.444748 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.444820 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.444848 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.444882 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.444905 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.548123 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.548183 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.548195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.548221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.548237 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.650798 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.650866 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.650883 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.650907 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.650924 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.753994 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.754080 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.754104 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.754138 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.754166 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.855242 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 14:09:13.736217014 +0000 UTC Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.857094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.857174 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.857195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.857224 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.857244 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.883877 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.883975 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.883951 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.883906 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:30 crc kubenswrapper[4675]: E0216 19:43:30.884155 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:30 crc kubenswrapper[4675]: E0216 19:43:30.884274 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:30 crc kubenswrapper[4675]: E0216 19:43:30.884445 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:30 crc kubenswrapper[4675]: E0216 19:43:30.884530 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.961233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.961329 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.961353 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.961384 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:30 crc kubenswrapper[4675]: I0216 19:43:30.961405 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:30Z","lastTransitionTime":"2026-02-16T19:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.064586 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.064679 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.064737 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.064772 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.064795 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.168301 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.168348 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.168360 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.168376 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.168388 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.272277 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.272356 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.272373 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.272397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.272415 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.376618 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.376668 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.376681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.376725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.376738 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.479725 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.479782 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.479801 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.479826 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.479843 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.583299 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.583334 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.583343 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.583359 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.583370 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.686218 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.686258 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.686268 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.686283 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.686295 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.788493 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.788524 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.788533 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.788547 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.788556 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.856174 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 20:59:39.090843246 +0000 UTC
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.890460 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.890540 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.890564 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.890633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.890657 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.993834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.993900 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.993917 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.993938 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:31 crc kubenswrapper[4675]: I0216 19:43:31.993955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:31Z","lastTransitionTime":"2026-02-16T19:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.096962 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.097005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.097016 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.097036 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.097053 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.200074 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.200116 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.200130 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.200161 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.200181 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.303440 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.303478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.303487 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.303506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.303516 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.406986 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.407022 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.407033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.407049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.407060 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.510243 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.510305 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.510322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.510347 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.510364 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.613019 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.613857 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.613922 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.613957 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.613984 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.717664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.717727 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.717739 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.717758 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.717768 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.821806 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.821888 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.821906 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.821935 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.821955 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.857192 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:13:32.180430999 +0000 UTC
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.883706 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.883743 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.883787 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.883743 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:32 crc kubenswrapper[4675]: E0216 19:43:32.883913 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:32 crc kubenswrapper[4675]: E0216 19:43:32.883966 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:32 crc kubenswrapper[4675]: E0216 19:43:32.884083 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:32 crc kubenswrapper[4675]: E0216 19:43:32.884177 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.925282 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.925316 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.925326 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.925345 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:32 crc kubenswrapper[4675]: I0216 19:43:32.925359 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:32Z","lastTransitionTime":"2026-02-16T19:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.028768 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.028812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.028822 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.028839 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.028852 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.131367 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.131418 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.131431 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.131452 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.131465 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.234825 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.234877 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.234895 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.234921 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.234940 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.341519 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.341591 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.341616 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.341653 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.341676 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.444612 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.445169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.445196 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.445221 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.445236 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.548609 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.549747 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.549869 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.549924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.549949 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.652871 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.652914 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.652934 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.652954 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.652967 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.756207 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.756255 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.756269 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.756292 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.756306 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.857371 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:24:55.957915164 +0000 UTC
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.859397 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.859468 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.859498 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.859524 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.859541 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.962592 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.962633 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.962644 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.962664 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:33 crc kubenswrapper[4675]: I0216 19:43:33.962677 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:33Z","lastTransitionTime":"2026-02-16T19:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.065412 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.065473 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.065484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.065506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.065519 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.168980 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.169049 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.169067 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.169094 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.169113 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.272417 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.272465 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.272478 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.272495 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.272507 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.375462 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.375515 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.375529 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.375548 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.375558 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.478660 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.478718 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.478730 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.478750 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.478762 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.581985 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.582040 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.582050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.582069 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.582082 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.685033 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.685105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.685124 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.685153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.685171 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.789246 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.789389 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.789410 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.789445 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.789467 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.858090 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:00:30.238361112 +0000 UTC Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.883666 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.883740 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.883751 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.883859 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:34 crc kubenswrapper[4675]: E0216 19:43:34.884743 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:34 crc kubenswrapper[4675]: E0216 19:43:34.884953 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:34 crc kubenswrapper[4675]: E0216 19:43:34.885204 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:34 crc kubenswrapper[4675]: E0216 19:43:34.885341 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.893096 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.893136 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.893153 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.893176 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.893193 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.996165 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.996256 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.996276 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.996303 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:34 crc kubenswrapper[4675]: I0216 19:43:34.996319 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:34Z","lastTransitionTime":"2026-02-16T19:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.099097 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.099175 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.099200 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.099233 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.099259 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.203262 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.203312 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.203322 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.203339 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.203351 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.307015 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.307072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.307085 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.307115 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.307144 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.411163 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.411226 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.411245 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.411275 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.411296 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.515393 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.515442 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.515455 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.515475 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.515488 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.618809 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.618880 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.618899 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.618925 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.618944 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.722681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.722812 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.722834 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.722863 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.722880 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.825396 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.825467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.825485 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.825511 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.825530 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.859094 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:23:29.035150105 +0000 UTC Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.928000 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.928057 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.928079 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.928108 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:35 crc kubenswrapper[4675]: I0216 19:43:35.928132 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:35Z","lastTransitionTime":"2026-02-16T19:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.031484 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.031549 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.031567 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.031593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.031612 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.134970 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.135053 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.135080 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.135116 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.135141 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.238037 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.238088 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.238098 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.238117 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.238128 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.341304 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.341399 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.341413 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.341441 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.341455 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.444853 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.444924 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.444942 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.445005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.445025 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.547467 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.547506 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.547516 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.547535 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.547546 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.651005 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.651069 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.651081 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.651105 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.651124 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.753600 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.753666 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.753681 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.753755 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.753774 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.857398 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.857470 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.857482 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.857500 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.857513 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.859584 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 04:47:24.428529801 +0000 UTC
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.884082 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.884148 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.884092 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.884184 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:36 crc kubenswrapper[4675]: E0216 19:43:36.884268 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:36 crc kubenswrapper[4675]: E0216 19:43:36.884439 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:36 crc kubenswrapper[4675]: E0216 19:43:36.884733 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:36 crc kubenswrapper[4675]: E0216 19:43:36.884781 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.963072 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.963169 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.963195 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.963236 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:36 crc kubenswrapper[4675]: I0216 19:43:36.963459 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:36Z","lastTransitionTime":"2026-02-16T19:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.067952 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.068046 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.068075 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.068106 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.068127 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:37Z","lastTransitionTime":"2026-02-16T19:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.170966 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.171023 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.171040 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.171066 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.171082 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:37Z","lastTransitionTime":"2026-02-16T19:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.215562 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.215593 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.215605 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.215620 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.215631 4675 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T19:43:37Z","lastTransitionTime":"2026-02-16T19:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.275303 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"]
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.276741 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.279618 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.280027 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.280161 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.280395 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.332196 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.33216847 podStartE2EDuration="45.33216847s" podCreationTimestamp="2026-02-16 19:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:37.330376326 +0000 UTC m=+100.455665912" watchObservedRunningTime="2026-02-16 19:43:37.33216847 +0000 UTC m=+100.457458036"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.332463 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.332457297 podStartE2EDuration="1m21.332457297s" podCreationTimestamp="2026-02-16 19:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:37.315118452 +0000 UTC m=+100.440408068" watchObservedRunningTime="2026-02-16 19:43:37.332457297 +0000 UTC m=+100.457746863"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.362750 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.363060 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.363297 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.363385 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.363408 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465178 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465291 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465322 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465441 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465602 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465635 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.465835 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.466979 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.473906 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.485259 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/599fcdd9-3e8e-40f1-b17c-baa3337b82fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kzjhv\" (UID: \"599fcdd9-3e8e-40f1-b17c-baa3337b82fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.594518 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv"
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.860742 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:09:27.536464501 +0000 UTC
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.860834 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 16 19:43:37 crc kubenswrapper[4675]: I0216 19:43:37.871744 4675 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.514443 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv" event={"ID":"599fcdd9-3e8e-40f1-b17c-baa3337b82fc","Type":"ContainerStarted","Data":"8792f0e7e31392a207e9934eb6c2250cb6cf0637f3f47ef148ccf034e12a1620"}
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.514522 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv" event={"ID":"599fcdd9-3e8e-40f1-b17c-baa3337b82fc","Type":"ContainerStarted","Data":"5ac0330c477aa0af12a578ffce3252c0cae68227f81bdc874e2e5ff0c15abca0"}
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.531391 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kzjhv" podStartSLOduration=79.531365703 podStartE2EDuration="1m19.531365703s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:43:38.530631915 +0000 UTC m=+101.655921511" watchObservedRunningTime="2026-02-16 19:43:38.531365703 +0000 UTC m=+101.656655269"
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.883613 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.883613 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.884053 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.884054 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:38 crc kubenswrapper[4675]: E0216 19:43:38.884194 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:38 crc kubenswrapper[4675]: E0216 19:43:38.884351 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:38 crc kubenswrapper[4675]: E0216 19:43:38.884516 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:38 crc kubenswrapper[4675]: E0216 19:43:38.884952 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:38 crc kubenswrapper[4675]: I0216 19:43:38.885258 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"
Feb 16 19:43:38 crc kubenswrapper[4675]: E0216 19:43:38.885423 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"
Feb 16 19:43:39 crc kubenswrapper[4675]: I0216 19:43:39.187261 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:39 crc kubenswrapper[4675]: E0216 19:43:39.187484 4675 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 16 19:43:39 crc kubenswrapper[4675]: E0216 19:43:39.187597 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs podName:8d5a5b47-38a4-4f7e-b40e-dba4825e18be nodeName:}" failed. No retries permitted until 2026-02-16 19:44:43.187573173 +0000 UTC m=+166.312862729 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs") pod "network-metrics-daemon-sbgjb" (UID: "8d5a5b47-38a4-4f7e-b40e-dba4825e18be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 16 19:43:40 crc kubenswrapper[4675]: I0216 19:43:40.884135 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:40 crc kubenswrapper[4675]: I0216 19:43:40.884134 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:40 crc kubenswrapper[4675]: E0216 19:43:40.884712 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:40 crc kubenswrapper[4675]: I0216 19:43:40.884203 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:40 crc kubenswrapper[4675]: I0216 19:43:40.884206 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:40 crc kubenswrapper[4675]: E0216 19:43:40.884862 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:40 crc kubenswrapper[4675]: E0216 19:43:40.884979 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:40 crc kubenswrapper[4675]: E0216 19:43:40.885043 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:42 crc kubenswrapper[4675]: I0216 19:43:42.883664 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:42 crc kubenswrapper[4675]: I0216 19:43:42.883771 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:42 crc kubenswrapper[4675]: I0216 19:43:42.883664 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:42 crc kubenswrapper[4675]: I0216 19:43:42.884591 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:42 crc kubenswrapper[4675]: E0216 19:43:42.884727 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:42 crc kubenswrapper[4675]: E0216 19:43:42.884993 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:42 crc kubenswrapper[4675]: E0216 19:43:42.885167 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:42 crc kubenswrapper[4675]: E0216 19:43:42.885272 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:44 crc kubenswrapper[4675]: I0216 19:43:44.883878 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:44 crc kubenswrapper[4675]: I0216 19:43:44.883968 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:44 crc kubenswrapper[4675]: I0216 19:43:44.884028 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:44 crc kubenswrapper[4675]: E0216 19:43:44.884099 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:44 crc kubenswrapper[4675]: I0216 19:43:44.884151 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:44 crc kubenswrapper[4675]: E0216 19:43:44.884245 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:44 crc kubenswrapper[4675]: E0216 19:43:44.884390 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:44 crc kubenswrapper[4675]: E0216 19:43:44.884597 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:46 crc kubenswrapper[4675]: I0216 19:43:46.884657 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:46 crc kubenswrapper[4675]: I0216 19:43:46.884848 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:46 crc kubenswrapper[4675]: I0216 19:43:46.884857 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:46 crc kubenswrapper[4675]: E0216 19:43:46.885027 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:46 crc kubenswrapper[4675]: I0216 19:43:46.885211 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:46 crc kubenswrapper[4675]: E0216 19:43:46.885459 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:46 crc kubenswrapper[4675]: E0216 19:43:46.885514 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:46 crc kubenswrapper[4675]: E0216 19:43:46.885824 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:48 crc kubenswrapper[4675]: I0216 19:43:48.884106 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:48 crc kubenswrapper[4675]: E0216 19:43:48.884337 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be"
Feb 16 19:43:48 crc kubenswrapper[4675]: I0216 19:43:48.884442 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 19:43:48 crc kubenswrapper[4675]: E0216 19:43:48.884560 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 19:43:48 crc kubenswrapper[4675]: I0216 19:43:48.885032 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:48 crc kubenswrapper[4675]: E0216 19:43:48.885125 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 19:43:48 crc kubenswrapper[4675]: I0216 19:43:48.886018 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 19:43:48 crc kubenswrapper[4675]: E0216 19:43:48.886394 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 19:43:49 crc kubenswrapper[4675]: I0216 19:43:49.885601 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"
Feb 16 19:43:49 crc kubenswrapper[4675]: E0216 19:43:49.885983 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gpc5z_openshift-ovn-kubernetes(9b6e2d5a-0472-425b-b5b4-0b94f14ebfba)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"
Feb 16 19:43:50 crc kubenswrapper[4675]: I0216 19:43:50.883756 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb"
Feb 16 19:43:50 crc kubenswrapper[4675]: I0216 19:43:50.883882 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 19:43:50 crc kubenswrapper[4675]: I0216 19:43:50.883882 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:50 crc kubenswrapper[4675]: E0216 19:43:50.884068 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:50 crc kubenswrapper[4675]: I0216 19:43:50.884122 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:50 crc kubenswrapper[4675]: E0216 19:43:50.884127 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:50 crc kubenswrapper[4675]: E0216 19:43:50.884173 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:50 crc kubenswrapper[4675]: E0216 19:43:50.884241 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:52 crc kubenswrapper[4675]: I0216 19:43:52.883272 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:52 crc kubenswrapper[4675]: I0216 19:43:52.883372 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:52 crc kubenswrapper[4675]: E0216 19:43:52.883416 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:52 crc kubenswrapper[4675]: I0216 19:43:52.883444 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:52 crc kubenswrapper[4675]: I0216 19:43:52.883468 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:52 crc kubenswrapper[4675]: E0216 19:43:52.883636 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:52 crc kubenswrapper[4675]: E0216 19:43:52.883705 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:52 crc kubenswrapper[4675]: E0216 19:43:52.883787 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:54 crc kubenswrapper[4675]: I0216 19:43:54.883510 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:54 crc kubenswrapper[4675]: I0216 19:43:54.883628 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:54 crc kubenswrapper[4675]: I0216 19:43:54.883510 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:54 crc kubenswrapper[4675]: E0216 19:43:54.883729 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:54 crc kubenswrapper[4675]: E0216 19:43:54.883787 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:54 crc kubenswrapper[4675]: E0216 19:43:54.883963 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:54 crc kubenswrapper[4675]: I0216 19:43:54.884529 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:54 crc kubenswrapper[4675]: E0216 19:43:54.884677 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:56 crc kubenswrapper[4675]: I0216 19:43:56.883611 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:56 crc kubenswrapper[4675]: I0216 19:43:56.883840 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:56 crc kubenswrapper[4675]: I0216 19:43:56.883659 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:56 crc kubenswrapper[4675]: E0216 19:43:56.883992 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:56 crc kubenswrapper[4675]: E0216 19:43:56.884212 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:56 crc kubenswrapper[4675]: E0216 19:43:56.884251 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:56 crc kubenswrapper[4675]: I0216 19:43:56.884929 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:56 crc kubenswrapper[4675]: E0216 19:43:56.885254 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:57 crc kubenswrapper[4675]: E0216 19:43:57.828492 4675 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 16 19:43:57 crc kubenswrapper[4675]: E0216 19:43:57.973501 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.589362 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/1.log" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.589993 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/0.log" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.590039 4675 generic.go:334] "Generic (PLEG): container finished" podID="c9a99563-d631-455f-8464-160e5619c610" containerID="4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac" exitCode=1 Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.590081 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerDied","Data":"4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac"} Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.590124 4675 scope.go:117] "RemoveContainer" containerID="798b2a0b186213c744675eb2bd1a33fcb05a8df10795373dbff1f91222fa17a9" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.590750 4675 scope.go:117] "RemoveContainer" containerID="4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac" Feb 16 19:43:58 crc kubenswrapper[4675]: E0216 19:43:58.591097 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pj5xg_openshift-multus(c9a99563-d631-455f-8464-160e5619c610)\"" pod="openshift-multus/multus-pj5xg" podUID="c9a99563-d631-455f-8464-160e5619c610" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.883975 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.884012 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:43:58 crc kubenswrapper[4675]: E0216 19:43:58.884154 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.884399 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:43:58 crc kubenswrapper[4675]: E0216 19:43:58.884489 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:43:58 crc kubenswrapper[4675]: I0216 19:43:58.884604 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:43:58 crc kubenswrapper[4675]: E0216 19:43:58.884893 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:43:58 crc kubenswrapper[4675]: E0216 19:43:58.885036 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:43:59 crc kubenswrapper[4675]: I0216 19:43:59.596724 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/1.log" Feb 16 19:44:00 crc kubenswrapper[4675]: I0216 19:44:00.883644 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:00 crc kubenswrapper[4675]: I0216 19:44:00.883775 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:00 crc kubenswrapper[4675]: E0216 19:44:00.883873 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:00 crc kubenswrapper[4675]: I0216 19:44:00.883898 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:00 crc kubenswrapper[4675]: I0216 19:44:00.883975 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:00 crc kubenswrapper[4675]: E0216 19:44:00.884111 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:00 crc kubenswrapper[4675]: E0216 19:44:00.884188 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:00 crc kubenswrapper[4675]: E0216 19:44:00.884291 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:02 crc kubenswrapper[4675]: I0216 19:44:02.883221 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:02 crc kubenswrapper[4675]: I0216 19:44:02.883278 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:02 crc kubenswrapper[4675]: I0216 19:44:02.883251 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:02 crc kubenswrapper[4675]: E0216 19:44:02.883383 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:02 crc kubenswrapper[4675]: I0216 19:44:02.883236 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:02 crc kubenswrapper[4675]: E0216 19:44:02.883699 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:02 crc kubenswrapper[4675]: E0216 19:44:02.883838 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:02 crc kubenswrapper[4675]: E0216 19:44:02.883876 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:02 crc kubenswrapper[4675]: E0216 19:44:02.975244 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 19:44:04 crc kubenswrapper[4675]: I0216 19:44:04.883390 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:04 crc kubenswrapper[4675]: I0216 19:44:04.884542 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:44:04 crc kubenswrapper[4675]: I0216 19:44:04.883451 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:04 crc kubenswrapper[4675]: I0216 19:44:04.883600 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:04 crc kubenswrapper[4675]: I0216 19:44:04.883472 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:04 crc kubenswrapper[4675]: E0216 19:44:04.884898 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:04 crc kubenswrapper[4675]: E0216 19:44:04.885022 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:04 crc kubenswrapper[4675]: E0216 19:44:04.885148 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:04 crc kubenswrapper[4675]: E0216 19:44:04.885293 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:05 crc kubenswrapper[4675]: I0216 19:44:05.623999 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/3.log" Feb 16 19:44:05 crc kubenswrapper[4675]: I0216 19:44:05.626472 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerStarted","Data":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} Feb 16 19:44:05 crc kubenswrapper[4675]: I0216 19:44:05.627000 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:44:05 crc kubenswrapper[4675]: I0216 19:44:05.660448 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podStartSLOduration=105.660425697 podStartE2EDuration="1m45.660425697s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:05.659618837 +0000 UTC m=+128.784908403" watchObservedRunningTime="2026-02-16 19:44:05.660425697 +0000 UTC m=+128.785715253" Feb 16 19:44:06 crc kubenswrapper[4675]: I0216 19:44:06.074998 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sbgjb"] Feb 16 19:44:06 crc kubenswrapper[4675]: I0216 19:44:06.075182 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:06 crc kubenswrapper[4675]: E0216 19:44:06.075366 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:06 crc kubenswrapper[4675]: I0216 19:44:06.883981 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:06 crc kubenswrapper[4675]: I0216 19:44:06.884056 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:06 crc kubenswrapper[4675]: E0216 19:44:06.884186 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:06 crc kubenswrapper[4675]: I0216 19:44:06.884224 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:06 crc kubenswrapper[4675]: E0216 19:44:06.884379 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:06 crc kubenswrapper[4675]: E0216 19:44:06.884481 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:07 crc kubenswrapper[4675]: I0216 19:44:07.883364 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:07 crc kubenswrapper[4675]: E0216 19:44:07.884436 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:07 crc kubenswrapper[4675]: E0216 19:44:07.980818 4675 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 19:44:08 crc kubenswrapper[4675]: I0216 19:44:08.884276 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:08 crc kubenswrapper[4675]: I0216 19:44:08.884444 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:08 crc kubenswrapper[4675]: I0216 19:44:08.884584 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:08 crc kubenswrapper[4675]: I0216 19:44:08.884852 4675 scope.go:117] "RemoveContainer" containerID="4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac" Feb 16 19:44:08 crc kubenswrapper[4675]: E0216 19:44:08.884863 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:08 crc kubenswrapper[4675]: E0216 19:44:08.885008 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:08 crc kubenswrapper[4675]: E0216 19:44:08.885193 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:09 crc kubenswrapper[4675]: I0216 19:44:09.654762 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/1.log" Feb 16 19:44:09 crc kubenswrapper[4675]: I0216 19:44:09.654840 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerStarted","Data":"d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01"} Feb 16 19:44:09 crc kubenswrapper[4675]: I0216 19:44:09.884458 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:09 crc kubenswrapper[4675]: E0216 19:44:09.884635 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:10 crc kubenswrapper[4675]: I0216 19:44:10.884239 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:10 crc kubenswrapper[4675]: I0216 19:44:10.884311 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:10 crc kubenswrapper[4675]: I0216 19:44:10.884362 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:10 crc kubenswrapper[4675]: E0216 19:44:10.884441 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:10 crc kubenswrapper[4675]: E0216 19:44:10.884539 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:10 crc kubenswrapper[4675]: E0216 19:44:10.884665 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:11 crc kubenswrapper[4675]: I0216 19:44:11.884304 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:11 crc kubenswrapper[4675]: E0216 19:44:11.884513 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbgjb" podUID="8d5a5b47-38a4-4f7e-b40e-dba4825e18be" Feb 16 19:44:12 crc kubenswrapper[4675]: I0216 19:44:12.884064 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:12 crc kubenswrapper[4675]: I0216 19:44:12.884098 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:12 crc kubenswrapper[4675]: I0216 19:44:12.884193 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:12 crc kubenswrapper[4675]: E0216 19:44:12.884321 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 19:44:12 crc kubenswrapper[4675]: E0216 19:44:12.884512 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 19:44:12 crc kubenswrapper[4675]: E0216 19:44:12.884568 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 19:44:12 crc kubenswrapper[4675]: I0216 19:44:12.929683 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:44:13 crc kubenswrapper[4675]: I0216 19:44:13.884380 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:13 crc kubenswrapper[4675]: I0216 19:44:13.887856 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 16 19:44:13 crc kubenswrapper[4675]: I0216 19:44:13.889317 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.883571 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.883605 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.883675 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.887198 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.888031 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.888146 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 16 19:44:14 crc kubenswrapper[4675]: I0216 19:44:14.889609 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.271050 4675 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.315553 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.316245 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.319298 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.319503 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.319782 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.319856 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.320291 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.320572 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.320580 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.320765 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.320915 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mgrxf"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.321418 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.323064 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fj4lh"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.323620 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.324171 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.324652 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.324966 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.326021 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.341204 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.346354 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.348566 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.348661 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.359333 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.359371 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.359802 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: W0216 19:44:18.360080 4675 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 16 19:44:18 crc kubenswrapper[4675]: E0216 19:44:18.360137 4675 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.360357 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.373094 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.376981 4675 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.377161 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.377269 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.377831 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.378613 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.379838 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6k7j5"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.380428 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.380855 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.381573 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.382216 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.382492 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.382647 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.382671 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.383768 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.383901 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.384240 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.385559 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-chc9w"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.385994 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.386288 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.386438 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.386434 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.386548 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.386678 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.387312 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.387429 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.387856 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388251 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388349 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388461 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 16 
19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388466 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388486 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388608 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388717 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388771 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388868 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.388881 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwgl"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.389182 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.389371 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.389455 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.392493 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.392630 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.393126 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.393347 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.393459 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.393797 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.393997 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.394408 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.395044 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.396501 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.399786 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.400068 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.400127 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.414723 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415255 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415564 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415725 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415838 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415904 
4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415959 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.415957 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.416078 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.416116 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.416867 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417127 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417236 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417448 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417581 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417666 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.417804 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7kmk"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418157 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418308 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418351 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418499 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418732 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418815 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.418845 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.419348 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.428957 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.429138 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.434000 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.434640 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.437951 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.438001 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.438022 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.438133 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.438155 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.439417 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.440518 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.440525 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.453320 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.454470 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.455085 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.461783 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462469 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462510 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462814 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462826 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.463007 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.463054 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462873 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.462869 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.463368 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.463528 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.465944 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.466482 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.467155 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.469050 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473621 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhs7\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-kube-api-access-qhhs7\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473667 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfk4g\" (UniqueName: \"kubernetes.io/projected/13c31ee7-d6e5-4796-b0f9-22111647def3-kube-api-access-pfk4g\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473712 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10890572-a59e-434e-9de6-cdc91e7ffa50-serving-cert\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473733 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-config\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473750 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-config\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473765 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-serving-cert\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473783 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnd2\" (UniqueName: \"kubernetes.io/projected/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-kube-api-access-5xnd2\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473800 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qgj\" (UniqueName: \"kubernetes.io/projected/5d5f11f1-66e4-4700-9c37-f3843d1769a5-kube-api-access-z9qgj\") pod \"downloads-7954f5f757-6k7j5\" (UID: \"5d5f11f1-66e4-4700-9c37-f3843d1769a5\") " pod="openshift-console/downloads-7954f5f757-6k7j5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473818 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473838 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-oauth-serving-cert\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473859 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6rg4\" (UniqueName: \"kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473876 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnlq\" (UniqueName: \"kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473895 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-config\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473915 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473933 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2792f84-b979-482e-812c-00eadcc75958-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473948 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473963 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473979 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.473998 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474017 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-service-ca\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474031 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474056 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2792f84-b979-482e-812c-00eadcc75958-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474071 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wlg\" (UniqueName: \"kubernetes.io/projected/b2792f84-b979-482e-812c-00eadcc75958-kube-api-access-v9wlg\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474089 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-client\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474103 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-serving-cert\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474125 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-oauth-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474142 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-images\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474158 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474179 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-service-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474197 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-config\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474222 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4178aeb7-6531-47e8-bf0d-695fbb18bc89-serving-cert\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474238 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-config\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474253 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4178aeb7-6531-47e8-bf0d-695fbb18bc89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474272 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cec4d170-3bfa-482c-9dea-06ada6534e6c-metrics-tls\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474287 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxjh\" (UniqueName: \"kubernetes.io/projected/061097a9-32e9-4972-989a-f3777a41bc2b-kube-api-access-4xxjh\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474302 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474318 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474334 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2dsg\" (UniqueName: \"kubernetes.io/projected/cec4d170-3bfa-482c-9dea-06ada6534e6c-kube-api-access-b2dsg\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474349 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474364 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p48bp\" (UniqueName: \"kubernetes.io/projected/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-kube-api-access-p48bp\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474379 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42zc\" (UniqueName: \"kubernetes.io/projected/4178aeb7-6531-47e8-bf0d-695fbb18bc89-kube-api-access-d42zc\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474394 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f97b\" (UniqueName: \"kubernetes.io/projected/635463ec-8d07-4e48-973a-f219b530f144-kube-api-access-9f97b\") pod \"migrator-59844c95c7-6rbbh\" (UID: \"635463ec-8d07-4e48-973a-f219b530f144\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474411 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjm2v\" (UniqueName: \"kubernetes.io/projected/10890572-a59e-434e-9de6-cdc91e7ffa50-kube-api-access-pjm2v\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474428 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474444 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474501 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474519 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-trusted-ca\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474534 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474558 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-serving-cert\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474573 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sg9k\" (UniqueName: \"kubernetes.io/projected/42669ce8-99eb-46ab-ac1e-ef126adaad60-kube-api-access-2sg9k\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474590 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-trusted-ca-bundle\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474608 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13c31ee7-d6e5-4796-b0f9-22111647def3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474624 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.474638 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-console-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.476125 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.476373 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.476495 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.476729 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.479060 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.479326 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.479482 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.479929 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.480082 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.480106 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.480253 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.480715 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.481033 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.481281 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.483774 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.484956 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.485896 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.488147 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.489528 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.490275 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.490644 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.491457 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.492675 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.493450 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.498216 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.503992 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hrp82"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.508143 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-86tqb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.508454 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hrp82"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.532837 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.536641 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.539180 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.541158 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.541597 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.542274 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.542640 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.546377 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.554219 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lsjl6"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.558763 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.567293 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.568106 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.571789 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.572615 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.573818 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.574659 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.575913 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.576765 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578165 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-service-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-config\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578254 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4178aeb7-6531-47e8-bf0d-695fbb18bc89-serving-cert\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578276
4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578307 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-image-import-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578336 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92c9592-4e6c-4589-809d-18de894b6352-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578357 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-config\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578376 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-etcd-client\") pod \"apiserver-76f77b778f-s7kmk\" (UID: 
\"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578403 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4178aeb7-6531-47e8-bf0d-695fbb18bc89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578426 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed87087-af1e-4042-b3ba-000095c2b183-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578451 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-proxy-tls\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578471 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c66cb83-e800-414f-a637-18fd6c6423e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578494 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da37a076-9ec6-4649-950a-db70d7802daf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578515 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-serving-cert\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578537 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-audit\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578556 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cec4d170-3bfa-482c-9dea-06ada6534e6c-metrics-tls\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578578 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxjh\" (UniqueName: \"kubernetes.io/projected/061097a9-32e9-4972-989a-f3777a41bc2b-kube-api-access-4xxjh\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: 
I0216 19:44:18.578608 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578634 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578655 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed87087-af1e-4042-b3ba-000095c2b183-proxy-tls\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578673 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da37a076-9ec6-4649-950a-db70d7802daf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578710 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2dsg\" (UniqueName: 
\"kubernetes.io/projected/cec4d170-3bfa-482c-9dea-06ada6534e6c-kube-api-access-b2dsg\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578734 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578763 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p48bp\" (UniqueName: \"kubernetes.io/projected/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-kube-api-access-p48bp\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578784 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42zc\" (UniqueName: \"kubernetes.io/projected/4178aeb7-6531-47e8-bf0d-695fbb18bc89-kube-api-access-d42zc\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578812 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578839 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-serving-cert\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578827 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578866 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjm2v\" (UniqueName: \"kubernetes.io/projected/10890572-a59e-434e-9de6-cdc91e7ffa50-kube-api-access-pjm2v\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578887 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f97b\" (UniqueName: \"kubernetes.io/projected/635463ec-8d07-4e48-973a-f219b530f144-kube-api-access-9f97b\") pod \"migrator-59844c95c7-6rbbh\" (UID: \"635463ec-8d07-4e48-973a-f219b530f144\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578911 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 
19:44:18.578935 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-encryption-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.578963 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579003 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579056 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579092 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579132 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-trusted-ca\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579177 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579224 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-serving-cert\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579254 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sg9k\" (UniqueName: \"kubernetes.io/projected/42669ce8-99eb-46ab-ac1e-ef126adaad60-kube-api-access-2sg9k\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579280 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-trusted-ca-bundle\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579299 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13c31ee7-d6e5-4796-b0f9-22111647def3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579327 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579350 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-client\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579374 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579398 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579427 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-console-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579453 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579487 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kkpj\" (UniqueName: \"kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579535 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrgx\" (UniqueName: \"kubernetes.io/projected/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-kube-api-access-wdrgx\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579562 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579587 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhs7\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-kube-api-access-qhhs7\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579607 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-policies\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579632 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-dir\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579654 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579731 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfk4g\" (UniqueName: \"kubernetes.io/projected/13c31ee7-d6e5-4796-b0f9-22111647def3-kube-api-access-pfk4g\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579758 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10890572-a59e-434e-9de6-cdc91e7ffa50-serving-cert\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579787 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.579814 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-config\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 
crc kubenswrapper[4675]: I0216 19:44:18.579837 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c9795a-67ff-43f2-a303-fdc91be1494d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584272 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584344 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-audit-dir\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584404 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-config\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584429 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-serving-cert\") pod 
\"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584458 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnd2\" (UniqueName: \"kubernetes.io/projected/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-kube-api-access-5xnd2\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584486 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62c9795a-67ff-43f2-a303-fdc91be1494d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584518 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584546 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 
19:44:18.584578 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qgj\" (UniqueName: \"kubernetes.io/projected/5d5f11f1-66e4-4700-9c37-f3843d1769a5-kube-api-access-z9qgj\") pod \"downloads-7954f5f757-6k7j5\" (UID: \"5d5f11f1-66e4-4700-9c37-f3843d1769a5\") " pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584612 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.584649 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-oauth-serving-cert\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.587444 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4178aeb7-6531-47e8-bf0d-695fbb18bc89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.580402 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.580079 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-service-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.580920 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-config\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.588979 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-config\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.589390 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4178aeb7-6531-47e8-bf0d-695fbb18bc89-serving-cert\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.595848 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6rg4\" (UniqueName: \"kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4\") pod 
\"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596288 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqb9k\" (UniqueName: \"kubernetes.io/projected/e92c9592-4e6c-4589-809d-18de894b6352-kube-api-access-tqb9k\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596349 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-images\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596415 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd9132b-47d7-4a19-841b-cc409d55c5ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596462 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596488 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596525 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnlq\" (UniqueName: \"kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596558 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkzf\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-kube-api-access-9qkzf\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596597 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-config\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596623 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596649 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596705 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596739 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2cd9132b-47d7-4a19-841b-cc409d55c5ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.596765 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c66cb83-e800-414f-a637-18fd6c6423e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: 
\"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598036 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598144 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2792f84-b979-482e-812c-00eadcc75958-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598170 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-oauth-serving-cert\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598179 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598258 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598306 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gn7g\" (UniqueName: \"kubernetes.io/projected/da37a076-9ec6-4649-950a-db70d7802daf-kube-api-access-4gn7g\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598333 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktwfb\" (UniqueName: \"kubernetes.io/projected/82473df6-10c6-4c55-9eb0-c0a98830ff79-kube-api-access-ktwfb\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598368 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598400 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-55l96\" (UID: 
\"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598434 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmj4d\" (UniqueName: \"kubernetes.io/projected/76908270-f3ef-412b-a18d-065cf006461e-kube-api-access-rmj4d\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598469 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598496 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92c9592-4e6c-4589-809d-18de894b6352-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598519 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd9132b-47d7-4a19-841b-cc409d55c5ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598553 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wtv\" (UniqueName: \"kubernetes.io/projected/02491684-b05b-4878-b335-ba69ffbe08c9-kube-api-access-w8wtv\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: \"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598588 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598611 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-service-ca\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598659 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2792f84-b979-482e-812c-00eadcc75958-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.598705 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wlg\" (UniqueName: \"kubernetes.io/projected/b2792f84-b979-482e-812c-00eadcc75958-kube-api-access-v9wlg\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619832 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-client\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619882 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-serving-cert\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619912 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619945 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c9795a-67ff-43f2-a303-fdc91be1494d-config\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619983 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-node-pullsecrets\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620016 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mgn\" (UniqueName: \"kubernetes.io/projected/9ed87087-af1e-4042-b3ba-000095c2b183-kube-api-access-c6mgn\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620047 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-oauth-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620068 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-images\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620095 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02491684-b05b-4878-b335-ba69ffbe08c9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: 
\"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620115 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-encryption-config\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620155 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.620250 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.619562 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-trusted-ca-bundle\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.600297 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-config\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.604079 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-config\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.621124 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-serving-cert\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.621773 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.609880 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2792f84-b979-482e-812c-00eadcc75958-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: 
I0216 19:44:18.605066 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.606718 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.609298 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-service-ca\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.609392 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cec4d170-3bfa-482c-9dea-06ada6534e6c-metrics-tls\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.624964 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-config\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc 
kubenswrapper[4675]: I0216 19:44:18.624963 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/061097a9-32e9-4972-989a-f3777a41bc2b-console-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.625571 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.625590 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.626214 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-images\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.610004 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc 
kubenswrapper[4675]: I0216 19:44:18.627161 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10890572-a59e-434e-9de6-cdc91e7ffa50-trusted-ca\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.610492 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-service-ca-bundle\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.610229 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.611103 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-ca\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.628088 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-oauth-config\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 
19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.628579 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.628800 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13c31ee7-d6e5-4796-b0f9-22111647def3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.629397 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.629839 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10890572-a59e-434e-9de6-cdc91e7ffa50-serving-cert\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.631871 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/061097a9-32e9-4972-989a-f3777a41bc2b-console-serving-cert\") pod 
\"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.633139 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.634124 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.634342 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.635292 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mgrxf"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.646933 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fj4lh"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.650443 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.650633 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2792f84-b979-482e-812c-00eadcc75958-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.650715 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-serving-cert\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.650744 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42669ce8-99eb-46ab-ac1e-ef126adaad60-etcd-client\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.650822 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.652581 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.653134 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.656190 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.656614 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n8s5c"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.660837 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.661388 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8s5c"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.664593 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.665356 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.671023 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.673595 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.674994 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-86tqb"]
Feb 16 19:44:18 crc 
kubenswrapper[4675]: I0216 19:44:18.676056 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.682190 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.684646 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.687865 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7kmk"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.689099 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.690270 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.692288 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.696610 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.697864 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.699501 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.700657 4675 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-chc9w"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.702759 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.703928 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.705012 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vjgs5"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.706028 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.706184 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-88zb5"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.706749 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-88zb5"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.707178 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwgl"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.708462 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.709567 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.710595 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.711734 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.712832 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.713837 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6k7j5"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.714906 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.715979 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.717096 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-vjgs5"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.718134 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lsjl6"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.718875 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.719268 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.720470 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721125 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721180 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kkpj\" (UniqueName: \"kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721232 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrgx\" (UniqueName: \"kubernetes.io/projected/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-kube-api-access-wdrgx\") pod 
\"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721264 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721307 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-policies\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721333 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-dir\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721361 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721400 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721429 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c9795a-67ff-43f2-a303-fdc91be1494d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721459 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721493 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-audit-dir\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721499 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-dir\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 
19:44:18.721535 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62c9795a-67ff-43f2-a303-fdc91be1494d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721565 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721591 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721701 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqb9k\" (UniqueName: \"kubernetes.io/projected/e92c9592-4e6c-4589-809d-18de894b6352-kube-api-access-tqb9k\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721739 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-images\") pod 
\"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721794 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd9132b-47d7-4a19-841b-cc409d55c5ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721827 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721857 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721897 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkzf\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-kube-api-access-9qkzf\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: 
I0216 19:44:18.721958 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.721990 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722020 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2cd9132b-47d7-4a19-841b-cc409d55c5ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722053 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c66cb83-e800-414f-a637-18fd6c6423e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722080 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722124 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gn7g\" (UniqueName: \"kubernetes.io/projected/da37a076-9ec6-4649-950a-db70d7802daf-kube-api-access-4gn7g\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722154 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktwfb\" (UniqueName: \"kubernetes.io/projected/82473df6-10c6-4c55-9eb0-c0a98830ff79-kube-api-access-ktwfb\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722198 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722226 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmj4d\" (UniqueName: \"kubernetes.io/projected/76908270-f3ef-412b-a18d-065cf006461e-kube-api-access-rmj4d\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722259 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92c9592-4e6c-4589-809d-18de894b6352-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722287 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd9132b-47d7-4a19-841b-cc409d55c5ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722347 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wtv\" (UniqueName: \"kubernetes.io/projected/02491684-b05b-4878-b335-ba69ffbe08c9-kube-api-access-w8wtv\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: \"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722404 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722440 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722506 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c9795a-67ff-43f2-a303-fdc91be1494d-config\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722659 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-node-pullsecrets\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722775 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mgn\" (UniqueName: \"kubernetes.io/projected/9ed87087-af1e-4042-b3ba-000095c2b183-kube-api-access-c6mgn\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722021 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-audit-policies\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722883 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc 
kubenswrapper[4675]: I0216 19:44:18.722893 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-node-pullsecrets\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722522 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723104 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02491684-b05b-4878-b335-ba69ffbe08c9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: \"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.722820 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb"]
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723430 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-encryption-config\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"
Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723478 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723533 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-image-import-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723578 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92c9592-4e6c-4589-809d-18de894b6352-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723591 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76908270-f3ef-412b-a18d-065cf006461e-audit-dir\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723607 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-etcd-client\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723642 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed87087-af1e-4042-b3ba-000095c2b183-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723670 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-proxy-tls\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723707 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c66cb83-e800-414f-a637-18fd6c6423e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723727 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da37a076-9ec6-4649-950a-db70d7802daf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723746 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-serving-cert\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723766 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-audit\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723812 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed87087-af1e-4042-b3ba-000095c2b183-proxy-tls\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723832 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da37a076-9ec6-4649-950a-db70d7802daf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723876 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723895 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-serving-cert\") pod 
\"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723936 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.723978 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-encryption-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.724005 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.724030 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.724071 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-client\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.724089 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.724738 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.725021 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.725211 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.725618 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v"] Feb 16 
19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.725645 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.725863 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c66cb83-e800-414f-a637-18fd6c6423e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.726115 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-image-import-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.726241 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.726489 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.726530 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88"] Feb 16 19:44:18 crc kubenswrapper[4675]: 
I0216 19:44:18.727971 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.729077 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.729574 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-audit\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.729599 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-serving-cert\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.729935 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8s5c"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.729943 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76908270-f3ef-412b-a18d-065cf006461e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.730059 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-etcd-client\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.730407 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed87087-af1e-4042-b3ba-000095c2b183-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.730441 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.730472 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c66cb83-e800-414f-a637-18fd6c6423e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.731053 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc 
kubenswrapper[4675]: I0216 19:44:18.731055 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.732294 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.732337 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7grkh"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.732465 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733190 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733437 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733766 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02491684-b05b-4878-b335-ba69ffbe08c9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: \"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733784 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-etcd-client\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733823 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733989 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7grkh"] Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.733909 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-encryption-config\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.735011 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76908270-f3ef-412b-a18d-065cf006461e-serving-cert\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.735218 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82473df6-10c6-4c55-9eb0-c0a98830ff79-encryption-config\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.739372 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.742158 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.758637 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.766073 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.786067 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.789431 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.799591 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.819791 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.838832 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.846047 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-images\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.861224 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.865743 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c9795a-67ff-43f2-a303-fdc91be1494d-config\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.879903 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.899518 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.908651 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-proxy-tls\") pod \"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.919726 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.926240 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c9795a-67ff-43f2-a303-fdc91be1494d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.940108 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.960302 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.979393 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 16 19:44:18 crc kubenswrapper[4675]: I0216 19:44:18.986956 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd9132b-47d7-4a19-841b-cc409d55c5ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.019753 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.040050 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.059867 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.067195 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd9132b-47d7-4a19-841b-cc409d55c5ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.078839 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.099549 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.119872 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.134528 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da37a076-9ec6-4649-950a-db70d7802daf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.140043 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 16 19:44:19 crc 
kubenswrapper[4675]: I0216 19:44:19.146850 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da37a076-9ec6-4649-950a-db70d7802daf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.160364 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.180539 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.199479 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.219019 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.224412 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e92c9592-4e6c-4589-809d-18de894b6352-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.240457 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.245222 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e92c9592-4e6c-4589-809d-18de894b6352-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.259753 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.279510 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.283257 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed87087-af1e-4042-b3ba-000095c2b183-proxy-tls\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.300226 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.340818 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.360804 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.380176 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.400110 4675 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.419614 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.438912 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.460103 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.480149 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.499186 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.519228 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.539487 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.557965 4675 request.go:700] Waited for 1.015740407s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0 Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.563151 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.580337 4675 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.600084 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.620119 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.640323 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.660311 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.679296 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.699726 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.719531 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.739309 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.766968 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.779312 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.799596 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.819674 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.839931 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.859994 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.879901 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.900438 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.919492 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.939568 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.960022 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.979457 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 16 19:44:19 crc kubenswrapper[4675]: I0216 19:44:19.999242 4675 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.019962 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.039072 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.083291 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.087264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnd2\" (UniqueName: \"kubernetes.io/projected/7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4-kube-api-access-5xnd2\") pod \"authentication-operator-69f744f599-dgnq8\" (UID: \"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.117169 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qgj\" (UniqueName: \"kubernetes.io/projected/5d5f11f1-66e4-4700-9c37-f3843d1769a5-kube-api-access-z9qgj\") pod \"downloads-7954f5f757-6k7j5\" (UID: \"5d5f11f1-66e4-4700-9c37-f3843d1769a5\") " pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.133949 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd12d696-66f6-4a7c-b6a4-75659b2c1a3b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vpzz5\" (UID: \"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 
19:44:20.162264 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxjh\" (UniqueName: \"kubernetes.io/projected/061097a9-32e9-4972-989a-f3777a41bc2b-kube-api-access-4xxjh\") pod \"console-f9d7485db-chc9w\" (UID: \"061097a9-32e9-4972-989a-f3777a41bc2b\") " pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.180403 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wlg\" (UniqueName: \"kubernetes.io/projected/b2792f84-b979-482e-812c-00eadcc75958-kube-api-access-v9wlg\") pod \"openshift-controller-manager-operator-756b6f6bc6-9kz76\" (UID: \"b2792f84-b979-482e-812c-00eadcc75958\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.193996 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.195393 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6rg4\" (UniqueName: \"kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4\") pod \"controller-manager-879f6c89f-qzcjb\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.211000 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.214594 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnlq\" (UniqueName: \"kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq\") pod \"route-controller-manager-6576b87f9c-q894z\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.219195 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.237883 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.239687 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.276632 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sg9k\" (UniqueName: \"kubernetes.io/projected/42669ce8-99eb-46ab-ac1e-ef126adaad60-kube-api-access-2sg9k\") pod \"etcd-operator-b45778765-mgrxf\" (UID: \"42669ce8-99eb-46ab-ac1e-ef126adaad60\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.299377 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjm2v\" (UniqueName: 
\"kubernetes.io/projected/10890572-a59e-434e-9de6-cdc91e7ffa50-kube-api-access-pjm2v\") pod \"console-operator-58897d9998-sqwgl\" (UID: \"10890572-a59e-434e-9de6-cdc91e7ffa50\") " pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.303369 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.315275 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f97b\" (UniqueName: \"kubernetes.io/projected/635463ec-8d07-4e48-973a-f219b530f144-kube-api-access-9f97b\") pod \"migrator-59844c95c7-6rbbh\" (UID: \"635463ec-8d07-4e48-973a-f219b530f144\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.364889 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42zc\" (UniqueName: \"kubernetes.io/projected/4178aeb7-6531-47e8-bf0d-695fbb18bc89-kube-api-access-d42zc\") pod \"openshift-config-operator-7777fb866f-dttlz\" (UID: \"4178aeb7-6531-47e8-bf0d-695fbb18bc89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.374785 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.376180 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2dsg\" (UniqueName: \"kubernetes.io/projected/cec4d170-3bfa-482c-9dea-06ada6534e6c-kube-api-access-b2dsg\") pod \"dns-operator-744455d44c-fj4lh\" (UID: \"cec4d170-3bfa-482c-9dea-06ada6534e6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.396324 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhs7\" (UniqueName: \"kubernetes.io/projected/f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666-kube-api-access-qhhs7\") pod \"cluster-image-registry-operator-dc59b4c8b-5qzwb\" (UID: \"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.409280 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.420114 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.421110 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.427451 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfk4g\" (UniqueName: \"kubernetes.io/projected/13c31ee7-d6e5-4796-b0f9-22111647def3-kube-api-access-pfk4g\") pod \"cluster-samples-operator-665b6dd947-7kckm\" (UID: \"13c31ee7-d6e5-4796-b0f9-22111647def3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.440022 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.460131 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.462360 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5"] Feb 16 19:44:20 crc kubenswrapper[4675]: W0216 19:44:20.466080 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd12d696_66f6_4a7c_b6a4_75659b2c1a3b.slice/crio-9d432ff03048aaa718838fc683cdb90202328d94a314935bc00bd1ceb6a5806a WatchSource:0}: Error finding container 9d432ff03048aaa718838fc683cdb90202328d94a314935bc00bd1ceb6a5806a: Status 404 returned error can't find the container with id 
9d432ff03048aaa718838fc683cdb90202328d94a314935bc00bd1ceb6a5806a Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.470772 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.472285 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.479645 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.479894 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.500231 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.515543 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dgnq8"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.519980 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.526098 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.541339 4675 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.543176 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" Feb 16 19:44:20 crc kubenswrapper[4675]: W0216 19:44:20.549066 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc12320_ba2f_4683_b3fd_ed8cb9ef07c4.slice/crio-dc8c76d287ac74895cfd35a959a62ad2a27899e95974217a5518e8f2c0b9b836 WatchSource:0}: Error finding container dc8c76d287ac74895cfd35a959a62ad2a27899e95974217a5518e8f2c0b9b836: Status 404 returned error can't find the container with id dc8c76d287ac74895cfd35a959a62ad2a27899e95974217a5518e8f2c0b9b836 Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.559362 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.576910 4675 request.go:700] Waited for 1.869929329s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.582206 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.606260 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.609511 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.621272 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.622205 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.655210 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.668273 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.686473 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kkpj\" (UniqueName: \"kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj\") pod \"oauth-openshift-558db77b4-55l96\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") " pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.705183 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktwfb\" (UniqueName: \"kubernetes.io/projected/82473df6-10c6-4c55-9eb0-c0a98830ff79-kube-api-access-ktwfb\") pod \"apiserver-7bbb656c7d-fg6bd\" (UID: \"82473df6-10c6-4c55-9eb0-c0a98830ff79\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.707433 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-chc9w"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.707799 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrgx\" (UniqueName: \"kubernetes.io/projected/1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f-kube-api-access-wdrgx\") pod 
\"machine-config-operator-74547568cd-hq42n\" (UID: \"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.736244 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2cd9132b-47d7-4a19-841b-cc409d55c5ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6rksb\" (UID: \"2cd9132b-47d7-4a19-841b-cc409d55c5ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.738042 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" event={"ID":"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4","Type":"ContainerStarted","Data":"dc8c76d287ac74895cfd35a959a62ad2a27899e95974217a5518e8f2c0b9b836"} Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.740048 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gn7g\" (UniqueName: \"kubernetes.io/projected/da37a076-9ec6-4649-950a-db70d7802daf-kube-api-access-4gn7g\") pod \"openshift-apiserver-operator-796bbdcf4f-g82z9\" (UID: \"da37a076-9ec6-4649-950a-db70d7802daf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.754965 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" event={"ID":"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b","Type":"ContainerStarted","Data":"9d432ff03048aaa718838fc683cdb90202328d94a314935bc00bd1ceb6a5806a"} Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.758391 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmj4d\" (UniqueName: 
\"kubernetes.io/projected/76908270-f3ef-412b-a18d-065cf006461e-kube-api-access-rmj4d\") pod \"apiserver-76f77b778f-s7kmk\" (UID: \"76908270-f3ef-412b-a18d-065cf006461e\") " pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.761903 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" event={"ID":"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a","Type":"ContainerStarted","Data":"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c"} Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.761949 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" event={"ID":"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a","Type":"ContainerStarted","Data":"eee48e579855f513be972a8235666d134e9b7a1feca7173d5fa3b7b923d0f79e"} Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.763479 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.766433 4675 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-q894z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.766478 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 
19:44:20.791826 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkzf\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-kube-api-access-9qkzf\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.799295 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6k7j5"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.802319 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.803598 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62c9795a-67ff-43f2-a303-fdc91be1494d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hksfk\" (UID: \"62c9795a-67ff-43f2-a303-fdc91be1494d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.809605 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.841862 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqb9k\" (UniqueName: \"kubernetes.io/projected/e92c9592-4e6c-4589-809d-18de894b6352-kube-api-access-tqb9k\") pod \"kube-storage-version-migrator-operator-b67b599dd-4l4lr\" (UID: \"e92c9592-4e6c-4589-809d-18de894b6352\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.845945 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.854881 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.859383 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c66cb83-e800-414f-a637-18fd6c6423e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ncsct\" (UID: \"8c66cb83-e800-414f-a637-18fd6c6423e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.861901 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.870366 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.877678 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.884490 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh"] Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.888328 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.888946 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.902279 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.903455 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mgn\" (UniqueName: \"kubernetes.io/projected/9ed87087-af1e-4042-b3ba-000095c2b183-kube-api-access-c6mgn\") pod \"machine-config-controller-84d6567774-ggdkr\" (UID: \"9ed87087-af1e-4042-b3ba-000095c2b183\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.920158 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.941260 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 16 19:44:20 crc kubenswrapper[4675]: I0216 19:44:20.975522 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fj4lh"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.002713 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.006779 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwgl"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.006828 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.008644 4675 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mgrxf"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.013121 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wtv\" (UniqueName: \"kubernetes.io/projected/02491684-b05b-4878-b335-ba69ffbe08c9-kube-api-access-w8wtv\") pod \"control-plane-machine-set-operator-78cbb6b69f-846bd\" (UID: \"02491684-b05b-4878-b335-ba69ffbe08c9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.017329 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.027490 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p48bp\" (UniqueName: \"kubernetes.io/projected/8de617fc-9ec5-4fd7-81a3-d1c7621bf288-kube-api-access-p48bp\") pod \"machine-api-operator-5694c8668f-lj8jk\" (UID: \"8de617fc-9ec5-4fd7-81a3-d1c7621bf288\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063732 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzzc5\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063800 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063826 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063858 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063886 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063951 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.063981 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.064039 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.064474 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:21.564460327 +0000 UTC m=+144.689749883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.067292 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.094159 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.128604 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.140214 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165071 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.165371 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:21.665331313 +0000 UTC m=+144.790621019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165469 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-auth-proxy-config\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165510 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jrh\" (UniqueName: \"kubernetes.io/projected/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-kube-api-access-f2jrh\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165621 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgxtl\" (UniqueName: \"kubernetes.io/projected/4157206a-b996-4d4c-8a5f-fc820bfaed06-kube-api-access-mgxtl\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165665 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-apiservice-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165813 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165880 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xtpz\" (UniqueName: \"kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165907 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d280f5d-e193-45f2-8422-fa0a4177c833-tmpfs\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.165952 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.166032 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-config-volume\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.166079 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sn4q\" (UniqueName: \"kubernetes.io/projected/2d280f5d-e193-45f2-8422-fa0a4177c833-kube-api-access-5sn4q\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.166118 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ssc\" (UniqueName: \"kubernetes.io/projected/a367d97c-eb93-4e8b-a42d-6696bb381617-kube-api-access-h9ssc\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.166270 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.167776 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.166464 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168098 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-metrics-tls\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168141 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9llg\" (UniqueName: \"kubernetes.io/projected/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-kube-api-access-z9llg\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168547 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-node-bootstrap-token\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168586 4675 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-registration-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168609 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9kn\" (UniqueName: \"kubernetes.io/projected/f65c0c4b-f568-44de-8408-2724a7841b62-kube-api-access-nd9kn\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168612 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168643 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftkjc\" (UniqueName: \"kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168869 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: 
\"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a367d97c-eb93-4e8b-a42d-6696bb381617-config\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168947 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-srv-cert\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.168970 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-cabundle\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.169192 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-profile-collector-cert\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.169583 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-plugins-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.169630 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-machine-approver-tls\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.169956 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgskr\" (UniqueName: \"kubernetes.io/projected/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-kube-api-access-bgskr\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170309 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff2ffa0-19be-4c14-858d-0ecd570876fb-cert\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170353 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzt65\" (UniqueName: \"kubernetes.io/projected/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-kube-api-access-tzt65\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc 
kubenswrapper[4675]: I0216 19:44:21.170491 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-csi-data-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170589 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170714 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170911 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-metrics-certs\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.170988 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-mountpoint-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: 
\"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.171103 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.171140 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-key\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.171219 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:21.671201198 +0000 UTC m=+144.796490754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.171557 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.171910 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq9c2\" (UniqueName: \"kubernetes.io/projected/3ff2ffa0-19be-4c14-858d-0ecd570876fb-kube-api-access-jq9c2\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.172298 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjkt\" (UniqueName: \"kubernetes.io/projected/e02c5612-7f04-4ff9-902e-45c4436e01ed-kube-api-access-lbjkt\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.172752 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvt6l\" (UniqueName: 
\"kubernetes.io/projected/d87befe1-6ea0-4820-9038-1f60a094cf3f-kube-api-access-vvt6l\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.172791 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.172837 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-stats-auth\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173136 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e02c5612-7f04-4ff9-902e-45c4436e01ed-service-ca-bundle\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173191 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c40cd259-a202-462a-95de-34b6eb92c90c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc 
kubenswrapper[4675]: I0216 19:44:21.173218 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zqn4\" (UniqueName: \"kubernetes.io/projected/c40cd259-a202-462a-95de-34b6eb92c90c-kube-api-access-5zqn4\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173239 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-socket-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173678 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a367d97c-eb93-4e8b-a42d-6696bb381617-serving-cert\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173730 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173805 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-config\") pod 
\"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173845 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0c4b-f568-44de-8408-2724a7841b62-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173881 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-default-certificate\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.173927 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8s49\" (UniqueName: \"kubernetes.io/projected/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-kube-api-access-r8s49\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.174046 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 
crc kubenswrapper[4675]: I0216 19:44:21.174084 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-webhook-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.174136 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzc5\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.174170 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.174195 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-certs\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.175153 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-srv-cert\") pod \"olm-operator-6b444d44fb-dvv45\" 
(UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.176181 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.186030 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.187258 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.200289 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.217133 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.257108 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzzc5\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.260243 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277403 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277719 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-config-volume\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277747 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5sn4q\" (UniqueName: \"kubernetes.io/projected/2d280f5d-e193-45f2-8422-fa0a4177c833-kube-api-access-5sn4q\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277769 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ssc\" (UniqueName: \"kubernetes.io/projected/a367d97c-eb93-4e8b-a42d-6696bb381617-kube-api-access-h9ssc\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277799 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277817 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-metrics-tls\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277834 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9llg\" (UniqueName: \"kubernetes.io/projected/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-kube-api-access-z9llg\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277857 
4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-node-bootstrap-token\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277875 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-registration-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277892 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd9kn\" (UniqueName: \"kubernetes.io/projected/f65c0c4b-f568-44de-8408-2724a7841b62-kube-api-access-nd9kn\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277911 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftkjc\" (UniqueName: \"kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277928 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a367d97c-eb93-4e8b-a42d-6696bb381617-config\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277947 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-srv-cert\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.277974 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-cabundle\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278005 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-profile-collector-cert\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278030 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-plugins-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278054 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-machine-approver-tls\") pod 
\"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278080 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgskr\" (UniqueName: \"kubernetes.io/projected/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-kube-api-access-bgskr\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278119 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff2ffa0-19be-4c14-858d-0ecd570876fb-cert\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278141 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzt65\" (UniqueName: \"kubernetes.io/projected/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-kube-api-access-tzt65\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278172 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-csi-data-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278204 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-metrics-certs\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-mountpoint-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278245 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-key\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278270 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278292 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9c2\" (UniqueName: \"kubernetes.io/projected/3ff2ffa0-19be-4c14-858d-0ecd570876fb-kube-api-access-jq9c2\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278325 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lbjkt\" (UniqueName: \"kubernetes.io/projected/e02c5612-7f04-4ff9-902e-45c4436e01ed-kube-api-access-lbjkt\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278355 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvt6l\" (UniqueName: \"kubernetes.io/projected/d87befe1-6ea0-4820-9038-1f60a094cf3f-kube-api-access-vvt6l\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278379 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278399 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-stats-auth\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278422 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e02c5612-7f04-4ff9-902e-45c4436e01ed-service-ca-bundle\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278444 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c40cd259-a202-462a-95de-34b6eb92c90c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278464 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zqn4\" (UniqueName: \"kubernetes.io/projected/c40cd259-a202-462a-95de-34b6eb92c90c-kube-api-access-5zqn4\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278483 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-socket-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278501 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a367d97c-eb93-4e8b-a42d-6696bb381617-serving-cert\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278522 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-config\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278544 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0c4b-f568-44de-8408-2724a7841b62-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278566 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-default-certificate\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278584 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8s49\" (UniqueName: \"kubernetes.io/projected/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-kube-api-access-r8s49\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278605 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278626 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-webhook-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278654 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278670 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-certs\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278829 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-srv-cert\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278855 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-auth-proxy-config\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278880 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f2jrh\" (UniqueName: \"kubernetes.io/projected/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-kube-api-access-f2jrh\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278905 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgxtl\" (UniqueName: \"kubernetes.io/projected/4157206a-b996-4d4c-8a5f-fc820bfaed06-kube-api-access-mgxtl\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278923 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-apiservice-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278942 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xtpz\" (UniqueName: \"kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.278962 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d280f5d-e193-45f2-8422-fa0a4177c833-tmpfs\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 
19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.279451 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d280f5d-e193-45f2-8422-fa0a4177c833-tmpfs\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.281891 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:21.781853551 +0000 UTC m=+144.907143107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.282187 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-plugins-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.284601 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-csi-data-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 
19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.286565 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-config-volume\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.290671 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-cabundle\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.291345 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-metrics-tls\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.291359 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.292356 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a367d97c-eb93-4e8b-a42d-6696bb381617-config\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.296338 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-webhook-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.297038 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-registration-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.297771 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.297976 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-mountpoint-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.298614 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-auth-proxy-config\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 
19:44:21.299248 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-config\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.299278 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-default-certificate\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.299926 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-signing-key\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.299932 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-socket-dir\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.301375 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4157206a-b996-4d4c-8a5f-fc820bfaed06-srv-cert\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.302096 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e02c5612-7f04-4ff9-902e-45c4436e01ed-service-ca-bundle\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.304839 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-machine-approver-tls\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.312535 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a367d97c-eb93-4e8b-a42d-6696bb381617-serving-cert\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.312784 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-stats-auth\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.312801 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-certs\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.312834 
4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d280f5d-e193-45f2-8422-fa0a4177c833-apiservice-cert\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.313073 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0c4b-f568-44de-8408-2724a7841b62-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.313248 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.313339 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-srv-cert\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.313424 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.313534 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d87befe1-6ea0-4820-9038-1f60a094cf3f-node-bootstrap-token\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.314191 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c40cd259-a202-462a-95de-34b6eb92c90c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.314262 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.315316 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sn4q\" (UniqueName: \"kubernetes.io/projected/2d280f5d-e193-45f2-8422-fa0a4177c833-kube-api-access-5sn4q\") pod \"packageserver-d55dfcdfc-p2j88\" (UID: \"2d280f5d-e193-45f2-8422-fa0a4177c833\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.323148 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.328539 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e02c5612-7f04-4ff9-902e-45c4436e01ed-metrics-certs\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.339917 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff2ffa0-19be-4c14-858d-0ecd570876fb-cert\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.345659 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ssc\" (UniqueName: \"kubernetes.io/projected/a367d97c-eb93-4e8b-a42d-6696bb381617-kube-api-access-h9ssc\") pod \"service-ca-operator-777779d784-6sj8v\" (UID: \"a367d97c-eb93-4e8b-a42d-6696bb381617\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.352482 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7kmk"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.356155 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.381239 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.381711 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:21.881698801 +0000 UTC m=+145.006988357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.384508 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjkt\" (UniqueName: \"kubernetes.io/projected/e02c5612-7f04-4ff9-902e-45c4436e01ed-kube-api-access-lbjkt\") pod \"router-default-5444994796-hrp82\" (UID: \"e02c5612-7f04-4ff9-902e-45c4436e01ed\") " pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.387751 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftkjc\" (UniqueName: \"kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc\") pod \"collect-profiles-29521170-6rzxb\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.407962 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z9llg\" (UniqueName: \"kubernetes.io/projected/6b1e44b2-cd12-41e7-9cbd-99509b1cdf84-kube-api-access-z9llg\") pod \"dns-default-n8s5c\" (UID: \"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84\") " pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.422823 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9c2\" (UniqueName: \"kubernetes.io/projected/3ff2ffa0-19be-4c14-858d-0ecd570876fb-kube-api-access-jq9c2\") pod \"ingress-canary-7grkh\" (UID: \"3ff2ffa0-19be-4c14-858d-0ecd570876fb\") " pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.423342 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.437373 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8s49\" (UniqueName: \"kubernetes.io/projected/8c3f1a44-2a72-4614-a4fb-979e49d99e2d-kube-api-access-r8s49\") pod \"machine-approver-56656f9798-jzrxx\" (UID: \"8c3f1a44-2a72-4614-a4fb-979e49d99e2d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.462298 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9kn\" (UniqueName: \"kubernetes.io/projected/f65c0c4b-f568-44de-8408-2724a7841b62-kube-api-access-nd9kn\") pod \"package-server-manager-789f6589d5-n5h4j\" (UID: \"f65c0c4b-f568-44de-8408-2724a7841b62\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.479444 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgxtl\" (UniqueName: 
\"kubernetes.io/projected/4157206a-b996-4d4c-8a5f-fc820bfaed06-kube-api-access-mgxtl\") pod \"olm-operator-6b444d44fb-dvv45\" (UID: \"4157206a-b996-4d4c-8a5f-fc820bfaed06\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.483562 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.485399 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:21.984037225 +0000 UTC m=+145.109326781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.505461 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.512526 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zqn4\" (UniqueName: \"kubernetes.io/projected/c40cd259-a202-462a-95de-34b6eb92c90c-kube-api-access-5zqn4\") pod \"multus-admission-controller-857f4d67dd-86tqb\" (UID: \"c40cd259-a202-462a-95de-34b6eb92c90c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.519458 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.527554 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.528829 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgskr\" (UniqueName: \"kubernetes.io/projected/7ce9d25c-a4a1-4414-8ed3-97ace0e8a309-kube-api-access-bgskr\") pod \"catalog-operator-68c6474976-944wg\" (UID: \"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.542143 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jrh\" (UniqueName: \"kubernetes.io/projected/b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f-kube-api-access-f2jrh\") pod \"csi-hostpathplugin-vjgs5\" (UID: \"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f\") " pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.547355 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.552915 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.568128 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvt6l\" (UniqueName: \"kubernetes.io/projected/d87befe1-6ea0-4820-9038-1f60a094cf3f-kube-api-access-vvt6l\") pod \"machine-config-server-88zb5\" (UID: \"d87befe1-6ea0-4820-9038-1f60a094cf3f\") " pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.574189 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.582742 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.583752 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzt65\" (UniqueName: \"kubernetes.io/projected/c45bcd4d-0bb4-4813-9eb9-5864e4f46bce-kube-api-access-tzt65\") pod \"service-ca-9c57cc56f-lsjl6\" (UID: \"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce\") " pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.589502 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.589967 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.089951454 +0000 UTC m=+145.215241010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.590252 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.600121 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.622121 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.632665 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-88zb5" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.634526 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.636345 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xtpz\" (UniqueName: \"kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz\") pod \"marketplace-operator-79b997595-6kj4n\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.640009 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7grkh" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.670908 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.697015 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.697966 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.197942638 +0000 UTC m=+145.323232194 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.707979 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.790146 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr"] Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.800768 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.801325 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.301283959 +0000 UTC m=+145.426573505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.810644 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.840198 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.851211 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" event={"ID":"d9de00f8-6995-42d6-ad28-1961096e55c0","Type":"ContainerStarted","Data":"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.851533 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" event={"ID":"d9de00f8-6995-42d6-ad28-1961096e55c0","Type":"ContainerStarted","Data":"42be020f7cf6b1831d63f4be1772397f61d480b88c20294d7a73905ea93523e8"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.858547 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" event={"ID":"7dc12320-ba2f-4683-b3fd-ed8cb9ef07c4","Type":"ContainerStarted","Data":"bcff3aa888115a6149539a2bb003e371f9a4723c025782055d5766638d78869c"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.861825 4675 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" event={"ID":"cec4d170-3bfa-482c-9dea-06ada6534e6c","Type":"ContainerStarted","Data":"4086ea969767519d9b2dd8feba39903a6bc532a70e004cf18140dca6b096cafe"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.866501 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.874667 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" event={"ID":"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666","Type":"ContainerStarted","Data":"c0c1bcae85de3d8a6a406c7bc388557f0c027844ad36b46dacd334361d3b2732"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.883967 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" event={"ID":"2cd9132b-47d7-4a19-841b-cc409d55c5ae","Type":"ContainerStarted","Data":"554197c8fb3340de3605f1c7904936fa3ddae984c893fc26f7e31d07124dc9d9"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.900504 4675 patch_prober.go:28] interesting pod/console-operator-58897d9998-sqwgl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.900560 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" podUID="10890572-a59e-434e-9de6-cdc91e7ffa50" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.902493 4675 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:21 crc kubenswrapper[4675]: E0216 19:44:21.903792 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.403752288 +0000 UTC m=+145.529041844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.936128 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.936172 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" event={"ID":"10890572-a59e-434e-9de6-cdc91e7ffa50","Type":"ContainerStarted","Data":"819f6acdb91d3a4e7d0cf5bf727d06bb800048555fdcb3e784e29e595c328c18"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.936191 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" 
event={"ID":"10890572-a59e-434e-9de6-cdc91e7ffa50","Type":"ContainerStarted","Data":"c5721b092ed910acec33b95d8010cb70aace65f72429175fdd6449f13d62fc7a"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.936200 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-chc9w" event={"ID":"061097a9-32e9-4972-989a-f3777a41bc2b","Type":"ContainerStarted","Data":"b00d6c6295aa4b66f23d00ed3055b1b6e1b175d6a9b0e57edf0fbe2f93b80ab5"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.936212 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-chc9w" event={"ID":"061097a9-32e9-4972-989a-f3777a41bc2b","Type":"ContainerStarted","Data":"06ed7fe80a2ebe0ced893b8ed338e5b4e084cae4ca610449b68fa1e3778e12aa"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.945300 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" event={"ID":"fd12d696-66f6-4a7c-b6a4-75659b2c1a3b","Type":"ContainerStarted","Data":"99cdb00d6a380b4bd63e88263b631d93863a19791af55d4c9ae7b1a25cfbdae9"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.953410 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" event={"ID":"62c9795a-67ff-43f2-a303-fdc91be1494d","Type":"ContainerStarted","Data":"be36f9b70ed39d692391ca54e29d913979c844060603020544496cc4c0267ac8"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.954863 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" event={"ID":"b2792f84-b979-482e-812c-00eadcc75958","Type":"ContainerStarted","Data":"072850f60f2e483daf209e581567510daf66bdf10d9a5c82005311e78b2adf2f"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.961885 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" event={"ID":"76908270-f3ef-412b-a18d-065cf006461e","Type":"ContainerStarted","Data":"3d71eaf832107dbcfb7602df6c99d5c06111b3200481ae8f2e0a46254fd47c4f"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.976496 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" event={"ID":"4178aeb7-6531-47e8-bf0d-695fbb18bc89","Type":"ContainerStarted","Data":"40252156a893521de56b701899178b0b79039276786b3cda7e3814a8bb43ac2c"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.978932 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" event={"ID":"13c31ee7-d6e5-4796-b0f9-22111647def3","Type":"ContainerStarted","Data":"ee8e4318d059a1e0c99bbe1650f97166f206f08761b572cab9e17d2d175951e8"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.995003 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" event={"ID":"635463ec-8d07-4e48-973a-f219b530f144","Type":"ContainerStarted","Data":"a9794ce677a95de81b2d32fe7ee193df33e3988fc63b6096dba891b42c036e3b"} Feb 16 19:44:21 crc kubenswrapper[4675]: I0216 19:44:21.995074 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" event={"ID":"635463ec-8d07-4e48-973a-f219b530f144","Type":"ContainerStarted","Data":"b5ed9a609b4f4b1c7fb36f97ecc428933c5134f08e6b51aad214dd3e9ccafc3e"} Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:21.999012 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" event={"ID":"42669ce8-99eb-46ab-ac1e-ef126adaad60","Type":"ContainerStarted","Data":"5832857f6e4ee5f776fecf5771c34d0922aeb2a773e9fd92f82ff56774f59fef"} Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.001885 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6k7j5" event={"ID":"5d5f11f1-66e4-4700-9c37-f3843d1769a5","Type":"ContainerStarted","Data":"7fe56de1a673cb2ec0225543c42828c0571689017c03964e4e1a2e9c9b6985be"} Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.001914 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6k7j5" event={"ID":"5d5f11f1-66e4-4700-9c37-f3843d1769a5","Type":"ContainerStarted","Data":"72f0d1e8f05f00229101066823e0ac1f88cf58e44c3b33326982408b86ba542b"} Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.003167 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.003890 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.008505 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.508483185 +0000 UTC m=+145.633772741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.023785 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.024168 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.029021 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" event={"ID":"82473df6-10c6-4c55-9eb0-c0a98830ff79","Type":"ContainerStarted","Data":"35ecc419fc5a13c24b03d41002533d333e08dac0c337768115b7e93bfde0bc95"} Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.049071 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.053795 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" podStartSLOduration=122.053765088 podStartE2EDuration="2m2.053765088s" 
podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.024856897 +0000 UTC m=+145.150146463" watchObservedRunningTime="2026-02-16 19:44:22.053765088 +0000 UTC m=+145.179054644"
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.110677 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.111003 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.610972154 +0000 UTC m=+145.736261710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.111915 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.118297 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.618274736 +0000 UTC m=+145.743564292 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.154013 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.159246 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.219177 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.220147 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.720132678 +0000 UTC m=+145.845422234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.300546 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-chc9w" podStartSLOduration=123.300525446 podStartE2EDuration="2m3.300525446s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.297596108 +0000 UTC m=+145.422885664" watchObservedRunningTime="2026-02-16 19:44:22.300525446 +0000 UTC m=+145.425815022"
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.319747 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.323088 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.323513 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.823496361 +0000 UTC m=+145.948785917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: W0216 19:44:22.344930 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02c5612_7f04_4ff9_902e_45c4436e01ed.slice/crio-cd0a7248364d09b5d4b6bb48e9bba1eef74bf158d522a1f1c43fc9dff72940df WatchSource:0}: Error finding container cd0a7248364d09b5d4b6bb48e9bba1eef74bf158d522a1f1c43fc9dff72940df: Status 404 returned error can't find the container with id cd0a7248364d09b5d4b6bb48e9bba1eef74bf158d522a1f1c43fc9dff72940df
Feb 16 19:44:22 crc kubenswrapper[4675]: W0216 19:44:22.346630 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92c9592_4e6c_4589_809d_18de894b6352.slice/crio-2db0b922e6aef06c7ad2c5793ebba3409d8286765bb812545cb89a07f4346912 WatchSource:0}: Error finding container 2db0b922e6aef06c7ad2c5793ebba3409d8286765bb812545cb89a07f4346912: Status 404 returned error can't find the container with id 2db0b922e6aef06c7ad2c5793ebba3409d8286765bb812545cb89a07f4346912
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.371495 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lj8jk"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.388082 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dgnq8" podStartSLOduration=123.3880504 podStartE2EDuration="2m3.3880504s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.3792962 +0000 UTC m=+145.504585746" watchObservedRunningTime="2026-02-16 19:44:22.3880504 +0000 UTC m=+145.513339956"
Feb 16 19:44:22 crc kubenswrapper[4675]: W0216 19:44:22.411329 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed87087_af1e_4042_b3ba_000095c2b183.slice/crio-35bd9292bfbb40ace111002bbf596d1ec9367ac8feec05e98a344f5e58fca5ec WatchSource:0}: Error finding container 35bd9292bfbb40ace111002bbf596d1ec9367ac8feec05e98a344f5e58fca5ec: Status 404 returned error can't find the container with id 35bd9292bfbb40ace111002bbf596d1ec9367ac8feec05e98a344f5e58fca5ec
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.422193 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.429558 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.430182 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:22.930158299 +0000 UTC m=+146.055447855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.463743 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" podStartSLOduration=122.463647591 podStartE2EDuration="2m2.463647591s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.450881985 +0000 UTC m=+145.576171551" watchObservedRunningTime="2026-02-16 19:44:22.463647591 +0000 UTC m=+145.588937147"
Feb 16 19:44:22 crc kubenswrapper[4675]: W0216 19:44:22.478885 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de617fc_9ec5_4fd7_81a3_d1c7621bf288.slice/crio-4047bde9ec691b3c25fa3ea1d1f9ad01be18743a5643a97f9cf4c2d8a6e06f64 WatchSource:0}: Error finding container 4047bde9ec691b3c25fa3ea1d1f9ad01be18743a5643a97f9cf4c2d8a6e06f64: Status 404 returned error can't find the container with id 4047bde9ec691b3c25fa3ea1d1f9ad01be18743a5643a97f9cf4c2d8a6e06f64
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.531832 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.532279 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.032262398 +0000 UTC m=+146.157551954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.632429 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.632931 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.132913828 +0000 UTC m=+146.258203384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.728035 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v"]
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.741441 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.741922 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.241906478 +0000 UTC m=+146.367196034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.799542 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" podStartSLOduration=123.799516065 podStartE2EDuration="2m3.799516065s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.796759973 +0000 UTC m=+145.922049529" watchObservedRunningTime="2026-02-16 19:44:22.799516065 +0000 UTC m=+145.924805641"
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.844600 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.845345 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.345307731 +0000 UTC m=+146.470597287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.900839 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" podStartSLOduration=122.900819153 podStartE2EDuration="2m2.900819153s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:22.860205563 +0000 UTC m=+145.985495119" watchObservedRunningTime="2026-02-16 19:44:22.900819153 +0000 UTC m=+146.026108709"
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.948533 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:22 crc kubenswrapper[4675]: E0216 19:44:22.949008 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.448992712 +0000 UTC m=+146.574282268 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:22 crc kubenswrapper[4675]: I0216 19:44:22.976059 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-86tqb"]
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.058498 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.059626 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.559595044 +0000 UTC m=+146.684884600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.062095 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.062667 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.562653284 +0000 UTC m=+146.687942850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.101030 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" event={"ID":"8de617fc-9ec5-4fd7-81a3-d1c7621bf288","Type":"ContainerStarted","Data":"4047bde9ec691b3c25fa3ea1d1f9ad01be18743a5643a97f9cf4c2d8a6e06f64"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.146863 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" event={"ID":"a367d97c-eb93-4e8b-a42d-6696bb381617","Type":"ContainerStarted","Data":"7f5cea9890154093ff244a959d673b5a9bc73adc1f75a29b49a99de4b1986f97"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.166890 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" event={"ID":"635463ec-8d07-4e48-973a-f219b530f144","Type":"ContainerStarted","Data":"e34d989801b47fba2f13a56ae96fdf8d32844abd90a1ea016606377c37bf05d2"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.171748 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.172396 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.672375714 +0000 UTC m=+146.797665280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.229732 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hrp82" event={"ID":"e02c5612-7f04-4ff9-902e-45c4436e01ed","Type":"ContainerStarted","Data":"cd0a7248364d09b5d4b6bb48e9bba1eef74bf158d522a1f1c43fc9dff72940df"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.254592 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vpzz5" podStartSLOduration=123.254560678 podStartE2EDuration="2m3.254560678s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:23.205390493 +0000 UTC m=+146.330680059" watchObservedRunningTime="2026-02-16 19:44:23.254560678 +0000 UTC m=+146.379850224"
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.275561 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.276400 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.776378693 +0000 UTC m=+146.901668239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.281029 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6k7j5" podStartSLOduration=124.281000394 podStartE2EDuration="2m4.281000394s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:23.280954973 +0000 UTC m=+146.406244529" watchObservedRunningTime="2026-02-16 19:44:23.281000394 +0000 UTC m=+146.406289950"
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.343016 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" event={"ID":"8c66cb83-e800-414f-a637-18fd6c6423e5","Type":"ContainerStarted","Data":"89bd72aac5e1ca6d1b6f12eee509b9a889ff3c362a061d99105a4c658ea191be"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.370987 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6rbbh" podStartSLOduration=123.370028039 podStartE2EDuration="2m3.370028039s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:23.355061214 +0000 UTC m=+146.480350780" watchObservedRunningTime="2026-02-16 19:44:23.370028039 +0000 UTC m=+146.495317615"
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.371768 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9kz76" event={"ID":"b2792f84-b979-482e-812c-00eadcc75958","Type":"ContainerStarted","Data":"066243df4674b2f551be7249f43ec5c27c9439ce0b3111dff8b3b88cd89d02e4"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.388893 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.389129 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.88908262 +0000 UTC m=+147.014372176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.389342 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.391797 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.891776291 +0000 UTC m=+147.017065847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.412944 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-88zb5" event={"ID":"d87befe1-6ea0-4820-9038-1f60a094cf3f","Type":"ContainerStarted","Data":"8d04395cbec2451394e96034bf26a4c8363de31f296bd6e9758242d512f91c8e"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.456437 4675 generic.go:334] "Generic (PLEG): container finished" podID="4178aeb7-6531-47e8-bf0d-695fbb18bc89" containerID="65599aae19592af865749222c099677f0777bdc3b690c70018add6b81026a14a" exitCode=0
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.456508 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" event={"ID":"4178aeb7-6531-47e8-bf0d-695fbb18bc89","Type":"ContainerDied","Data":"65599aae19592af865749222c099677f0777bdc3b690c70018add6b81026a14a"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.490385 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.491255 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.99123021 +0000 UTC m=+147.116519756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.491675 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.492011 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:23.992002421 +0000 UTC m=+147.117291977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.520313 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" event={"ID":"13c31ee7-d6e5-4796-b0f9-22111647def3","Type":"ContainerStarted","Data":"b2cd12ac7e66b7e62d3e3f2c60ed6059c0607c3ad2bbe77835df36d59596de39"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.556889 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" event={"ID":"cec4d170-3bfa-482c-9dea-06ada6534e6c","Type":"ContainerStarted","Data":"2534a0399e6a72b4a6f19870a28dbdb38c41890a76817988c86784aa949b391c"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.595288 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.595588 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.095563178 +0000 UTC m=+147.220852734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.595703 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.596031 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" event={"ID":"8c3f1a44-2a72-4614-a4fb-979e49d99e2d","Type":"ContainerStarted","Data":"ee44b8c7cc69e24dbb9ad51f4fde9aa7b380d50c0f959a4bca9d305c7ec6f108"}
Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.596173 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.096164783 +0000 UTC m=+147.221454339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.612291 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" event={"ID":"da37a076-9ec6-4649-950a-db70d7802daf","Type":"ContainerStarted","Data":"737f50ee5703662c129895bc0e78fe0cda3910e13b94d550cebf6d0e8fcd7f05"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.631299 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" event={"ID":"42669ce8-99eb-46ab-ac1e-ef126adaad60","Type":"ContainerStarted","Data":"120aaed85c57101158f318fd1159e4f9896679b51ddaacba91dbc15c33da89c5"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.657979 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" event={"ID":"02491684-b05b-4878-b335-ba69ffbe08c9","Type":"ContainerStarted","Data":"caa48e5598a6be48055bbbf44ca24b2f364eef5d0a97decec206be21476435c0"}
Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.669898 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mgrxf" podStartSLOduration=123.669878694 podStartE2EDuration="2m3.669878694s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:23.669428173 +0000 UTC m=+146.794717739"
watchObservedRunningTime="2026-02-16 19:44:23.669878694 +0000 UTC m=+146.795168250" Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.672970 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" event={"ID":"b5a72330-c8e5-4be3-8083-ced47a7b6ada","Type":"ContainerStarted","Data":"2f2effe6edd1814211f494b314a63a0456fd25f540f453ec51b05d8962f43ed5"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.689842 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" event={"ID":"e92c9592-4e6c-4589-809d-18de894b6352","Type":"ContainerStarted","Data":"2db0b922e6aef06c7ad2c5793ebba3409d8286765bb812545cb89a07f4346912"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.691125 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" event={"ID":"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f","Type":"ContainerStarted","Data":"8b93ae28d8fe9a32a49790b635dabffb6e519a9ed49ef7aa2299a7e83b7158df"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.692131 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" event={"ID":"f9b2f6cf-9260-43ee-a1bd-3d04b6e0b666","Type":"ContainerStarted","Data":"0bd6fd078f2a3fec75d824b0a73f198d773aad95e5b97baeab7dff3ad112a73d"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.697248 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" event={"ID":"9ed87087-af1e-4042-b3ba-000095c2b183","Type":"ContainerStarted","Data":"35bd9292bfbb40ace111002bbf596d1ec9367ac8feec05e98a344f5e58fca5ec"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.708528 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.710004 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.209986711 +0000 UTC m=+147.335276277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.717155 4675 generic.go:334] "Generic (PLEG): container finished" podID="82473df6-10c6-4c55-9eb0-c0a98830ff79" containerID="113913cd8f4f3e721176c98249020e4c8d0f559773e5377d94c9c1ea306d76a9" exitCode=0 Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.717248 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" event={"ID":"82473df6-10c6-4c55-9eb0-c0a98830ff79","Type":"ContainerDied","Data":"113913cd8f4f3e721176c98249020e4c8d0f559773e5377d94c9c1ea306d76a9"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.728831 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" 
event={"ID":"2d280f5d-e193-45f2-8422-fa0a4177c833","Type":"ContainerStarted","Data":"ca7ddf38da1dea09b17efa750e46b572ffba411e8e7fb04185b3c8084586c533"} Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.728884 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.738488 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.748848 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.749115 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.770375 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5qzwb" podStartSLOduration=123.77034236 podStartE2EDuration="2m3.77034236s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:23.729904175 +0000 UTC m=+146.855193731" watchObservedRunningTime="2026-02-16 19:44:23.77034236 +0000 UTC m=+146.895631916" Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.818643 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.819934 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.319910775 +0000 UTC m=+147.445200331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.864137 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb"] Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.887586 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lsjl6"] Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.919822 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:23 crc kubenswrapper[4675]: E0216 19:44:23.920913 4675 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.420894323 +0000 UTC m=+147.546183879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:23 crc kubenswrapper[4675]: I0216 19:44:23.942145 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vjgs5"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.013217 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7grkh"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.022611 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.023145 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.523126175 +0000 UTC m=+147.648415731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.037653 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.108448 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8s5c"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.109964 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.125604 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.126068 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.626052976 +0000 UTC m=+147.751342532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.226660 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.227346 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.727324572 +0000 UTC m=+147.852614128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.310717 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.328491 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.329243 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.829220796 +0000 UTC m=+147.954510352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.431418 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.432323 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:24.93231012 +0000 UTC m=+148.057599676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.516147 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"] Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.534305 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.534888 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.034865581 +0000 UTC m=+148.160155137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.561552 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sqwgl" Feb 16 19:44:24 crc kubenswrapper[4675]: W0216 19:44:24.621955 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408894c5_c798_48ff_93ac_bc8ea114ee4a.slice/crio-87f352d61dab8b390f045f5835b03edb64f2e820328abd145ad4f940204f38a5 WatchSource:0}: Error finding container 87f352d61dab8b390f045f5835b03edb64f2e820328abd145ad4f940204f38a5: Status 404 returned error can't find the container with id 87f352d61dab8b390f045f5835b03edb64f2e820328abd145ad4f940204f38a5 Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.639525 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.639923 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:25.139908467 +0000 UTC m=+148.265198013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.746013 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.746321 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.246297938 +0000 UTC m=+148.371587494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.746628 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.747210 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.247201112 +0000 UTC m=+148.372490668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.766915 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-88zb5" event={"ID":"d87befe1-6ea0-4820-9038-1f60a094cf3f","Type":"ContainerStarted","Data":"92bf5b9ed3b6138546df6253fbc8bf5c6a8e918cf2d37d12b2f5e5fda271c7fc"} Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.804397 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" event={"ID":"8c3f1a44-2a72-4614-a4fb-979e49d99e2d","Type":"ContainerStarted","Data":"5fd9236d2abc270d35c13bc985dde505fd2c0089e0a1fffb0c4aa738171a457b"} Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.823447 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" event={"ID":"e92c9592-4e6c-4589-809d-18de894b6352","Type":"ContainerStarted","Data":"90cdc18974f341818a9def892dfae652b27a8c397d6ffdb126849ad3acefce89"} Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.845487 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" event={"ID":"da37a076-9ec6-4649-950a-db70d7802daf","Type":"ContainerStarted","Data":"d67e8a4963acae88e88132309b8268727e82ee6d998ed80da9c979820e7e8e25"} Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.847486 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.847806 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.347778721 +0000 UTC m=+148.473068467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.848185 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.849339 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" event={"ID":"f65c0c4b-f568-44de-8408-2724a7841b62","Type":"ContainerStarted","Data":"cbc51463b9f51387dba7ecb02e293d7c22dafadfb3d65f22fef86c9baaa31a9c"} Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.849681 4675 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.34966601 +0000 UTC m=+148.474955566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.881233 4675 generic.go:334] "Generic (PLEG): container finished" podID="76908270-f3ef-412b-a18d-065cf006461e" containerID="8f037b932edbbf85fc5bc4ffc79d6accfdd50b4414825114fbfaceee721f7138" exitCode=0
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.881906 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" event={"ID":"76908270-f3ef-412b-a18d-065cf006461e","Type":"ContainerDied","Data":"8f037b932edbbf85fc5bc4ffc79d6accfdd50b4414825114fbfaceee721f7138"}
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.894240 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" event={"ID":"4178aeb7-6531-47e8-bf0d-695fbb18bc89","Type":"ContainerStarted","Data":"bf2eb722fdba7efdc1f256cda668aa4cc147fe1b9b459b598462108011a7b8d0"}
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.895108 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz"
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.911678 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" event={"ID":"2d280f5d-e193-45f2-8422-fa0a4177c833","Type":"ContainerStarted","Data":"ab24f5237efd7d194e8e0fdcaf60367dc2c21c2820c5c53bb1d075c252b25293"}
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.913133 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88"
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.955667 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" event={"ID":"8c66cb83-e800-414f-a637-18fd6c6423e5","Type":"ContainerStarted","Data":"aeefd247398bbb7fd7ac66e4be77fcbe4bd7e900ba18a1fb7f3f983a59997457"}
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.955936 4675 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-p2j88 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body=
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.955981 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" podUID="2d280f5d-e193-45f2-8422-fa0a4177c833" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused"
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.956720 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.956872 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.456850973 +0000 UTC m=+148.582140529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:24 crc kubenswrapper[4675]: I0216 19:44:24.957570 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:24 crc kubenswrapper[4675]: E0216 19:44:24.958211 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.458185768 +0000 UTC m=+148.583475504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.010010 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" event={"ID":"9ed87087-af1e-4042-b3ba-000095c2b183","Type":"ContainerStarted","Data":"39c21c9a81bac42d410be05304c6f938c24b3d472c8bf5764386590f9c442900"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.027051 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" event={"ID":"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce","Type":"ContainerStarted","Data":"84ddbcb413ca05031b8d9e26951ff9eb0ea08dff74a6d9f38d847b9db0d1ad84"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.037178 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4l4lr" podStartSLOduration=125.037152177 podStartE2EDuration="2m5.037152177s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.019663707 +0000 UTC m=+148.144953263" watchObservedRunningTime="2026-02-16 19:44:25.037152177 +0000 UTC m=+148.162441733"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.039832 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-88zb5" podStartSLOduration=7.039823498 podStartE2EDuration="7.039823498s" podCreationTimestamp="2026-02-16 19:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:24.905776418 +0000 UTC m=+148.031065974" watchObservedRunningTime="2026-02-16 19:44:25.039823498 +0000 UTC m=+148.165113054"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.058638 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.058791 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" event={"ID":"02491684-b05b-4878-b335-ba69ffbe08c9","Type":"ContainerStarted","Data":"1b138d636823029f66a49a61f97240483a2691342367b43fbca615dc3b27cc31"}
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.059808 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.559789413 +0000 UTC m=+148.685078969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.063116 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" event={"ID":"408894c5-c798-48ff-93ac-bc8ea114ee4a","Type":"ContainerStarted","Data":"87f352d61dab8b390f045f5835b03edb64f2e820328abd145ad4f940204f38a5"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.082116 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hrp82" event={"ID":"e02c5612-7f04-4ff9-902e-45c4436e01ed","Type":"ContainerStarted","Data":"ee09b544ee2d2cad3a7f57e186e4df58f5a5cde60103cd65be723de1422a4e36"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.124459 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" podStartSLOduration=126.124439196 podStartE2EDuration="2m6.124439196s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.124261691 +0000 UTC m=+148.249551257" watchObservedRunningTime="2026-02-16 19:44:25.124439196 +0000 UTC m=+148.249728752"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.124745 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" event={"ID":"4157206a-b996-4d4c-8a5f-fc820bfaed06","Type":"ContainerStarted","Data":"515943a3e98ea6d56c5914e72cd12a10e3b80bae895160ecde7c98cdc8dde1fc"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.126003 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.148363 4675 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dvv45 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.148429 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" podUID="4157206a-b996-4d4c-8a5f-fc820bfaed06" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.167909 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.170459 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.670446497 +0000 UTC m=+148.795736053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.203601 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" event={"ID":"cec4d170-3bfa-482c-9dea-06ada6534e6c","Type":"ContainerStarted","Data":"1c2b89aa7a9c2239739ce4a0eb0155363fd523fe83a1ee36bf66550fb1dd4e93"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.225984 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" event={"ID":"2cd9132b-47d7-4a19-841b-cc409d55c5ae","Type":"ContainerStarted","Data":"16c61232fe4a42010024064cf209abc33ea268f21206060d8d44b649d80270a6"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.268626 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" event={"ID":"7a536c24-ac8f-4add-96e8-064bbcd40ba6","Type":"ContainerStarted","Data":"d92a1b6d8f41970b526ab4f11e1507cd77a28a49f971c7a70d47d1aa509a02c5"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.268728 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" event={"ID":"7a536c24-ac8f-4add-96e8-064bbcd40ba6","Type":"ContainerStarted","Data":"aaf64f03c469b364570831b6285e8fa513b7b9ff1ed74930e06e5806c63ea08f"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.268922 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.270443 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.77042177 +0000 UTC m=+148.895711326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.313047 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" podStartSLOduration=125.313014332 podStartE2EDuration="2m5.313014332s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.310318501 +0000 UTC m=+148.435608057" watchObservedRunningTime="2026-02-16 19:44:25.313014332 +0000 UTC m=+148.438303888"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.313911 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g82z9" podStartSLOduration=126.313902055 podStartE2EDuration="2m6.313902055s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.252918319 +0000 UTC m=+148.378207875" watchObservedRunningTime="2026-02-16 19:44:25.313902055 +0000 UTC m=+148.439191601"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.328623 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" event={"ID":"a367d97c-eb93-4e8b-a42d-6696bb381617","Type":"ContainerStarted","Data":"a92712297f9c9e8e973e59ad0aee264bc1f744772ea8506802f1a48c737d596f"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.359110 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" event={"ID":"62c9795a-67ff-43f2-a303-fdc91be1494d","Type":"ContainerStarted","Data":"742dfd73b139001de451c64f24700f93d3532df449b48ca19157fd385ccf5460"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.359358 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hrp82" podStartSLOduration=125.359337791 podStartE2EDuration="2m5.359337791s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.355353457 +0000 UTC m=+148.480643033" watchObservedRunningTime="2026-02-16 19:44:25.359337791 +0000 UTC m=+148.484627347"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.371172 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.372782 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.872764365 +0000 UTC m=+148.998053931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.399045 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" event={"ID":"8de617fc-9ec5-4fd7-81a3-d1c7621bf288","Type":"ContainerStarted","Data":"b8fe0c42810adbf2a69b98be738a93bcc77e4fece32d3dffc062fae18c57cec0"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.480496 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" event={"ID":"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309","Type":"ContainerStarted","Data":"c0346c67d19a15acc04ccb9bb2599bb10319a714f20876fde8f41dd891a557db"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.481704 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.483963 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.485465 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:25.985440632 +0000 UTC m=+149.110730188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.493067 4675 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-944wg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body=
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.493149 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" podUID="7ce9d25c-a4a1-4414-8ed3-97ace0e8a309" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.508319 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hrp82"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.518917 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 16 19:44:25 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld
Feb 16 19:44:25 crc kubenswrapper[4675]: [+]process-running ok
Feb 16 19:44:25 crc kubenswrapper[4675]: healthz check failed
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.518992 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.542430 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" podStartSLOduration=125.542404212 podStartE2EDuration="2m5.542404212s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.459739525 +0000 UTC m=+148.585029091" watchObservedRunningTime="2026-02-16 19:44:25.542404212 +0000 UTC m=+148.667693788"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.586562 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.587023 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.087006267 +0000 UTC m=+149.212295823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.594970 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8s5c" event={"ID":"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84","Type":"ContainerStarted","Data":"107133f3177c9eaf058f1cc39a3a36136ab6738b82121b013c48dd6a73dbc5fb"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.624794 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7grkh" event={"ID":"3ff2ffa0-19be-4c14-858d-0ecd570876fb","Type":"ContainerStarted","Data":"728385810d3c0247293419b326e7954c9521e28c4408b08bb7ca363e509e44f1"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.628356 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-846bd" podStartSLOduration=125.628331585 podStartE2EDuration="2m5.628331585s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.543885861 +0000 UTC m=+148.669175437" watchObservedRunningTime="2026-02-16 19:44:25.628331585 +0000 UTC m=+148.753621161"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.690756 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.691840 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.191820337 +0000 UTC m=+149.317109883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.704246 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" event={"ID":"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f","Type":"ContainerStarted","Data":"46dd974af5ddde5454ee25126439460aa27a93d138ceba077cc847d76399071b"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.704320 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" event={"ID":"1b4bbcb2-7564-4018-9d41-2ecd0cc7a80f","Type":"ContainerStarted","Data":"95e1ceab45060b76ed54b346ac6af271fe99188eab8b27fc745088a830be9c11"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.716356 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" podStartSLOduration=125.716333492 podStartE2EDuration="2m5.716333492s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.637595779 +0000 UTC m=+148.762885325" watchObservedRunningTime="2026-02-16 19:44:25.716333492 +0000 UTC m=+148.841623048"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.717551 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fj4lh" podStartSLOduration=125.717545974 podStartE2EDuration="2m5.717545974s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.715948472 +0000 UTC m=+148.841238028" watchObservedRunningTime="2026-02-16 19:44:25.717545974 +0000 UTC m=+148.842835530"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.764502 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" event={"ID":"b5a72330-c8e5-4be3-8083-ced47a7b6ada","Type":"ContainerStarted","Data":"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.764569 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.784366 4675 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-55l96 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body=
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.784445 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.785143 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6rksb" podStartSLOduration=125.785130054 podStartE2EDuration="2m5.785130054s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.784724993 +0000 UTC m=+148.910014549" watchObservedRunningTime="2026-02-16 19:44:25.785130054 +0000 UTC m=+148.910419610"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.804894 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" event={"ID":"c40cd259-a202-462a-95de-34b6eb92c90c","Type":"ContainerStarted","Data":"7e1fa6ed8bcd4af054233ec93399073d507de6e81728dd266d48092c58b1ddec"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.805526 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" event={"ID":"c40cd259-a202-462a-95de-34b6eb92c90c","Type":"ContainerStarted","Data":"9107351f9adbb16f68968bb50f5ae54d347a141ee043ce79c044e4ac0389c12d"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.820794 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" event={"ID":"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f","Type":"ContainerStarted","Data":"80b7d6da2d084b644618543e206fae2d2fdc7a7af144f298b556191369193266"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.835902 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.837679 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.337660757 +0000 UTC m=+149.462950313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.838338 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" event={"ID":"13c31ee7-d6e5-4796-b0f9-22111647def3","Type":"ContainerStarted","Data":"aba912ff7d1b6f2e3ed0023843485e2e0b1604d9ce25c699336b08f108977815"}
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.839082 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.839143 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.886287 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6sj8v" podStartSLOduration=125.886265147 podStartE2EDuration="2m5.886265147s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.83815562 +0000 UTC m=+148.963445176" watchObservedRunningTime="2026-02-16 19:44:25.886265147 +0000 UTC m=+149.011554693"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.887861 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" podStartSLOduration=125.887839528 podStartE2EDuration="2m5.887839528s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.885112576 +0000 UTC m=+149.010402142" watchObservedRunningTime="2026-02-16 19:44:25.887839528 +0000 UTC m=+149.013129084"
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.941287 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 19:44:25 crc kubenswrapper[4675]: E0216 19:44:25.942844 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.442828746 +0000 UTC m=+149.568118292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:25 crc kubenswrapper[4675]: I0216 19:44:25.947024 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hq42n" podStartSLOduration=125.947000916 podStartE2EDuration="2m5.947000916s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:25.923393774 +0000 UTC m=+149.048683341" watchObservedRunningTime="2026-02-16 19:44:25.947000916 +0000 UTC m=+149.072290472"
Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.043758 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.044233 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.544215226 +0000 UTC m=+149.669504772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.096243 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" podStartSLOduration=127.096221056 podStartE2EDuration="2m7.096221056s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:26.095147957 +0000 UTC m=+149.220437523" watchObservedRunningTime="2026-02-16 19:44:26.096221056 +0000 UTC m=+149.221510612"
Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.098452 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7grkh" podStartSLOduration=8.098444894 podStartE2EDuration="8.098444894s" podCreationTimestamp="2026-02-16 19:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:26.049268299 +0000 UTC m=+149.174557845" watchObservedRunningTime="2026-02-16 19:44:26.098444894 +0000 UTC m=+149.223734450"
Feb 
16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.159334 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.159787 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.659771149 +0000 UTC m=+149.785060705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.201772 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" podStartSLOduration=127.201746094 podStartE2EDuration="2m7.201746094s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:26.194108223 +0000 UTC m=+149.319397789" watchObservedRunningTime="2026-02-16 19:44:26.201746094 +0000 UTC m=+149.327035650" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.264261 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.264673 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.764658091 +0000 UTC m=+149.889947637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.325007 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hksfk" podStartSLOduration=126.324980169 podStartE2EDuration="2m6.324980169s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:26.271961763 +0000 UTC m=+149.397251319" watchObservedRunningTime="2026-02-16 19:44:26.324980169 +0000 UTC m=+149.450269725" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.366635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.367181 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.86715746 +0000 UTC m=+149.992447016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.471803 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.472246 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:26.972232957 +0000 UTC m=+150.097522513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.524327 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:26 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:26 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:26 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.524386 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.574884 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.575395 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:27.075369163 +0000 UTC m=+150.200658729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.680001 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.680343 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.680380 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.682753 4675 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.683218 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.183194502 +0000 UTC m=+150.308484058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.695637 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.782254 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 
19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.782520 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.782640 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.783394 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.28335292 +0000 UTC m=+150.408642466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.798555 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.810402 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.887708 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.888088 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:27.388074577 +0000 UTC m=+150.513364133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.899243 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.906902 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.913858 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.929633 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" event={"ID":"82473df6-10c6-4c55-9eb0-c0a98830ff79","Type":"ContainerStarted","Data":"8a09f3ab67aa22806681e4c7e955c9fa904d2a010da59e494982c7d34ae2603e"} Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.980395 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" event={"ID":"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f","Type":"ContainerStarted","Data":"701dbe4a442b6c70817735f6bd5a5a09cafda2979cd277df23bb50bb2536611c"} Feb 16 19:44:26 crc kubenswrapper[4675]: I0216 19:44:26.990212 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:26 crc kubenswrapper[4675]: E0216 19:44:26.991395 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.491368687 +0000 UTC m=+150.616658233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.046378 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" event={"ID":"76908270-f3ef-412b-a18d-065cf006461e","Type":"ContainerStarted","Data":"3a1d82dbb39c8e7d7e69bf86581120b06348e0e234bcdb14974ba288e5940ad4"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.065174 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lsjl6" event={"ID":"c45bcd4d-0bb4-4813-9eb9-5864e4f46bce","Type":"ContainerStarted","Data":"cb180475b9ca3f841f8ea71586e999b5e19ed4415353e357ff3e17dc14f49f8c"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.075350 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7kckm" podStartSLOduration=128.075331058 podStartE2EDuration="2m8.075331058s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:26.327370952 +0000 UTC m=+149.452660508" watchObservedRunningTime="2026-02-16 19:44:27.075331058 +0000 UTC m=+150.200620614" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.092835 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.093328 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.593311112 +0000 UTC m=+150.718600668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.131884 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" event={"ID":"8de617fc-9ec5-4fd7-81a3-d1c7621bf288","Type":"ContainerStarted","Data":"a04d77931a9f33c01d122195da3c3b9bad257390bca6dc0a9ae7fc5bc655a543"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.196172 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.197790 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.697767532 +0000 UTC m=+150.823057088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.226081 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8s5c" event={"ID":"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84","Type":"ContainerStarted","Data":"423fb664765ab382e05c2be1dd2dbacb914c2305602ef3a2443f8425e3ec8071"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.226134 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8s5c" event={"ID":"6b1e44b2-cd12-41e7-9cbd-99509b1cdf84","Type":"ContainerStarted","Data":"d004dd1b4f4010193041379f2cd5da6752a61aace69a07421960ef9ed41f409d"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.226643 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.231909 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lj8jk" podStartSLOduration=127.23188135 podStartE2EDuration="2m7.23188135s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.225165473 
+0000 UTC m=+150.350455049" watchObservedRunningTime="2026-02-16 19:44:27.23188135 +0000 UTC m=+150.357170906" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.232634 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" podStartSLOduration=127.2326277 podStartE2EDuration="2m7.2326277s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.083280257 +0000 UTC m=+150.208569833" watchObservedRunningTime="2026-02-16 19:44:27.2326277 +0000 UTC m=+150.357917256" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.252201 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7grkh" event={"ID":"3ff2ffa0-19be-4c14-858d-0ecd570876fb","Type":"ContainerStarted","Data":"9c262b1f666468b804dc32663d1fd96ba79cc2ec19762a924f4b9b60f83f640c"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.286460 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" event={"ID":"9ed87087-af1e-4042-b3ba-000095c2b183","Type":"ContainerStarted","Data":"f3cfbd25cffae8ad5fea64c6738b289cef9e923ca5a43c6136cc086a59502f74"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.293784 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n8s5c" podStartSLOduration=9.29376318 podStartE2EDuration="9.29376318s" podCreationTimestamp="2026-02-16 19:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.29300574 +0000 UTC m=+150.418295296" watchObservedRunningTime="2026-02-16 19:44:27.29376318 +0000 UTC m=+150.419052736" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.297720 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.298124 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.798108944 +0000 UTC m=+150.923398490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.310668 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" event={"ID":"f65c0c4b-f568-44de-8408-2724a7841b62","Type":"ContainerStarted","Data":"8f1d1da75e45fef3a8a185b75ef9bf0f7ab9ed0c2bbf33692bbfd31549854b65"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.310744 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" event={"ID":"f65c0c4b-f568-44de-8408-2724a7841b62","Type":"ContainerStarted","Data":"497e57a9586f174fc56188161ce7ce1ff4aae96cc6792eac1fcb35da768ae8fe"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.311588 4675 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.369363 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" event={"ID":"c40cd259-a202-462a-95de-34b6eb92c90c","Type":"ContainerStarted","Data":"72c416d1d89d4e9beff7689f4ba95dd8b49e066fe974974be8d8820e37ed0261"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.385333 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ggdkr" podStartSLOduration=127.385312991 podStartE2EDuration="2m7.385312991s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.338619121 +0000 UTC m=+150.463908677" watchObservedRunningTime="2026-02-16 19:44:27.385312991 +0000 UTC m=+150.510602547" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.399862 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.401051 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:27.901030505 +0000 UTC m=+151.026320061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.419680 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" event={"ID":"7ce9d25c-a4a1-4414-8ed3-97ace0e8a309","Type":"ContainerStarted","Data":"78eb39ec4afd62fd7bd6df45c23ef489db370f46e87346d1fa9f1020a271083e"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.434909 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" podStartSLOduration=127.434887076 podStartE2EDuration="2m7.434887076s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.389400608 +0000 UTC m=+150.514690164" watchObservedRunningTime="2026-02-16 19:44:27.434887076 +0000 UTC m=+150.560176632" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.435491 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-86tqb" podStartSLOduration=127.435484462 podStartE2EDuration="2m7.435484462s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.433281564 +0000 UTC m=+150.558571120" watchObservedRunningTime="2026-02-16 19:44:27.435484462 +0000 UTC 
m=+150.560774018" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.468302 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" event={"ID":"4157206a-b996-4d4c-8a5f-fc820bfaed06","Type":"ContainerStarted","Data":"82cac54a5200a374fc34922aed99e2813c468cdc63dd6dc765671f3ac42b1872"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.495455 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" event={"ID":"8c3f1a44-2a72-4614-a4fb-979e49d99e2d","Type":"ContainerStarted","Data":"dca1a94e5ccb530a722b635e9ee1fe5b6376f517e2128714f6889ce1fa640114"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.496885 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-944wg" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.502600 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.504429 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.004411496 +0000 UTC m=+151.129701052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.522006 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" event={"ID":"408894c5-c798-48ff-93ac-bc8ea114ee4a","Type":"ContainerStarted","Data":"fb1bd0c01fe0d208c5c17d908853e15d6ee199b812c327ac4ce94792b244add8"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.523728 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.526111 4675 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6kj4n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.526168 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.526216 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:27 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:27 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:27 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.526285 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.544360 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jzrxx" podStartSLOduration=128.544336107 podStartE2EDuration="2m8.544336107s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.542994942 +0000 UTC m=+150.668284498" watchObservedRunningTime="2026-02-16 19:44:27.544336107 +0000 UTC m=+150.669625663" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.566065 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" event={"ID":"8c66cb83-e800-414f-a637-18fd6c6423e5","Type":"ContainerStarted","Data":"66ceed61bb70204a0cbc3f584a990da1c7170e5719023a109f58cb60d20e6991"} Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.604047 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.606287 4675 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.106258198 +0000 UTC m=+151.231547754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.616257 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvv45" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.665338 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" podStartSLOduration=127.665310033 podStartE2EDuration="2m7.665310033s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.656190093 +0000 UTC m=+150.781479649" watchObservedRunningTime="2026-02-16 19:44:27.665310033 +0000 UTC m=+150.790599589" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.671184 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.707474 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.710938 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.210921684 +0000 UTC m=+151.336211240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.798879 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ncsct" podStartSLOduration=127.798858799 podStartE2EDuration="2m7.798858799s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:27.793389506 +0000 UTC m=+150.918679062" watchObservedRunningTime="2026-02-16 19:44:27.798858799 +0000 UTC m=+150.924148355" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.812493 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.812591 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.312570091 +0000 UTC m=+151.437859647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.817674 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.818273 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.31825056 +0000 UTC m=+151.443540116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.921613 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:27 crc kubenswrapper[4675]: E0216 19:44:27.922182 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.422161026 +0000 UTC m=+151.547450582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.982254 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.982965 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.989672 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 16 19:44:27 crc kubenswrapper[4675]: I0216 19:44:27.990035 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.006691 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.048855 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.049792 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.549775437 +0000 UTC m=+151.675064993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.150405 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.151388 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.151430 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.151643 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:28.651617759 +0000 UTC m=+151.776907315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.256981 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.257031 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.257055 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.257519 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.757502297 +0000 UTC m=+151.882791853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.257762 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: W0216 19:44:28.293751 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d276ca2856559c059eb97e8b29d5b965586f6cdcd3171e33074733405f91695a WatchSource:0}: Error finding container d276ca2856559c059eb97e8b29d5b965586f6cdcd3171e33074733405f91695a: Status 404 returned error can't find the container with id d276ca2856559c059eb97e8b29d5b965586f6cdcd3171e33074733405f91695a Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.305680 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.359959 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.360572 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.86051931 +0000 UTC m=+151.985808866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.399536 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.464052 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.464642 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:28.964620411 +0000 UTC m=+152.089909967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.514209 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:28 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:28 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:28 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.514309 4675 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.568882 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.569083 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.069055491 +0000 UTC m=+152.194345047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.569144 4675 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dttlz container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.569190 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.569184 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" podUID="4178aeb7-6531-47e8-bf0d-695fbb18bc89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.569673 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.069655307 +0000 UTC m=+152.194944863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.570576 4675 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-p2j88 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.570606 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" podUID="2d280f5d-e193-45f2-8422-fa0a4177c833" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.645122 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" event={"ID":"76908270-f3ef-412b-a18d-065cf006461e","Type":"ContainerStarted","Data":"81c3d3ca2d6878221903c7187c7ad070b6d11286f65cf2c2645194580f9e1058"} Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.665456 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d276ca2856559c059eb97e8b29d5b965586f6cdcd3171e33074733405f91695a"} Feb 16 19:44:28 crc 
kubenswrapper[4675]: I0216 19:44:28.670310 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.670620 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.170603635 +0000 UTC m=+152.295893191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.684540 4675 generic.go:334] "Generic (PLEG): container finished" podID="7a536c24-ac8f-4add-96e8-064bbcd40ba6" containerID="d92a1b6d8f41970b526ab4f11e1507cd77a28a49f971c7a70d47d1aa509a02c5" exitCode=0 Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.684621 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" event={"ID":"7a536c24-ac8f-4add-96e8-064bbcd40ba6","Type":"ContainerDied","Data":"d92a1b6d8f41970b526ab4f11e1507cd77a28a49f971c7a70d47d1aa509a02c5"} Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.705209 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dd08a240ada56f4bfb0095d565ee18c61d56c458785931632d19fb67b28a1d2d"} Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.746909 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fa354d3368b3ed2955bd80f16ddb009037b8521c7b83e30b04de027efe5a928e"} Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.747961 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2670f2239b85627c756c330cd8d540656f7b0430f7bb07155fefa1490124a91c"} Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.749403 4675 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6kj4n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.749484 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.773529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.776650 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.276634537 +0000 UTC m=+152.401924103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.777195 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dttlz" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.821992 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" podStartSLOduration=129.821969731 podStartE2EDuration="2m9.821969731s" podCreationTimestamp="2026-02-16 19:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:28.716651858 +0000 UTC m=+151.841941414" watchObservedRunningTime="2026-02-16 19:44:28.821969731 +0000 UTC m=+151.947259297" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.878033 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.879940 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.379920107 +0000 UTC m=+152.505209653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.927379 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.928378 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:28 crc kubenswrapper[4675]: W0216 19:44:28.939549 4675 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.939628 4675 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 16 19:44:28 crc kubenswrapper[4675]: I0216 19:44:28.989859 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:28 crc kubenswrapper[4675]: E0216 19:44:28.990972 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.490957041 +0000 UTC m=+152.616246597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.035939 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.096897 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.097507 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57kvs\" (UniqueName: \"kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.097659 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.097826 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.098055 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.59801823 +0000 UTC m=+152.723307776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.098840 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p2j88" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.100298 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.101550 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.106177 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.153831 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204641 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57kvs\" (UniqueName: \"kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204716 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204743 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204768 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: 
\"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204830 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204890 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.204917 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5q59\" (UniqueName: \"kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.205772 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.206132 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-16 19:44:29.706118857 +0000 UTC m=+152.831408413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.206392 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.279013 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57kvs\" (UniqueName: \"kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs\") pod \"certified-operators-grmd4\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.279106 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.280457 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.289245 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.307584 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.307797 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.307839 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5q59\" (UniqueName: \"kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.307886 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.308028 4675 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.807998209 +0000 UTC m=+152.933287765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.308409 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.319348 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.373531 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5q59\" (UniqueName: \"kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59\") pod \"community-operators-h6msp\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.409490 4675 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.409580 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.409600 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.409627 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvmx\" (UniqueName: \"kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.410188 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:29.91016522 +0000 UTC m=+153.035454776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.420704 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.421980 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.432062 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.443512 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.444991 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:44:29 crc kubenswrapper[4675]: W0216 19:44:29.497821 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod66b8d4ad_0bb4_4d74_be6c_fde055dd85d6.slice/crio-954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5 WatchSource:0}: Error finding container 954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5: Status 404 returned error can't find the container with id 954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5 Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510309 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510484 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510514 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccc57\" (UniqueName: \"kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510539 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510590 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510628 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.510656 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvmx\" (UniqueName: \"kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.511030 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.011016085 +0000 UTC m=+153.136305631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.512164 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.512277 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.516524 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:29 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:29 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:29 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.516604 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.551407 4675 csr.go:261] certificate signing request csr-b4ww5 is approved, waiting to be issued Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.561869 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvmx\" (UniqueName: \"kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx\") pod \"certified-operators-gfhrq\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") " pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.562375 4675 csr.go:257] certificate signing request csr-b4ww5 is issued Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.611615 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.611677 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.611730 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc 
kubenswrapper[4675]: I0216 19:44:29.611755 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccc57\" (UniqueName: \"kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.612262 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.112236061 +0000 UTC m=+153.237525617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.612498 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.612631 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc 
kubenswrapper[4675]: I0216 19:44:29.637549 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccc57\" (UniqueName: \"kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57\") pod \"community-operators-bbn8r\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.712762 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.713032 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.212990504 +0000 UTC m=+153.338280060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.713131 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.713642 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.213622991 +0000 UTC m=+153.338912547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.738629 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.767785 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.771753 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"55035fbec7c39480e929e1ae36003312d919667a00be90bdb71b55695b6fd01d"} Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.774773 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.775498 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gfhrq" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.783523 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6","Type":"ContainerStarted","Data":"954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5"} Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.805962 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" event={"ID":"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f","Type":"ContainerStarted","Data":"ecbd5dc6810e951351ff7111bf619ff2bfd988db7c5fa4970bd6af4080d8e5ba"} Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.814620 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.815039 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.315024291 +0000 UTC m=+153.440313847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.839097 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0c8b87db6e4409d5db29ec71bc4944c173003cbf918075c6b1feced9cb73eca4"} Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.840591 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.848992 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:44:29 crc kubenswrapper[4675]: I0216 19:44:29.932857 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: 
\"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:29 crc kubenswrapper[4675]: E0216 19:44:29.935033 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.435016971 +0000 UTC m=+153.560306527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ctjpc" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.006963 4675 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.037308 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:30 crc kubenswrapper[4675]: E0216 19:44:30.037797 4675 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 19:44:30.537774006 +0000 UTC m=+153.663063562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.054828 4675 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-16T19:44:30.007249762Z","Handler":null,"Name":""} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.068333 4675 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.068397 4675 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.142603 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.177286 4675 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.177334 4675 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.248214 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.260342 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.261988 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.280137 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.280341 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.286915 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.351598 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.351642 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.376518 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.376584 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" 
podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.377042 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.377075 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.406442 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ctjpc\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") " pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.411201 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.411242 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.416504 4675 patch_prober.go:28] interesting pod/console-f9d7485db-chc9w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: 
connection refused" start-of-body= Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.416582 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-chc9w" podUID="061097a9-32e9-4972-989a-f3777a41bc2b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.419270 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.441903 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.464233 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.464505 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.464542 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc 
kubenswrapper[4675]: I0216 19:44:30.465232 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.502920 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.520653 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:30 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:30 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:30 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.520736 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.564486 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-16 19:39:29 +0000 UTC, rotation deadline is 2026-12-13 06:17:52.561361635 +0000 UTC Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.565039 4675 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7186h33m21.996327734s for next 
certificate rotation Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.575511 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.618029 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.652524 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.684962 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.803753 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.804224 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.816114 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.816159 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.824242 4675 patch_prober.go:28] interesting pod/apiserver-76f77b778f-s7kmk container/openshift-apiserver 
namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]log ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]etcd ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/generic-apiserver-start-informers ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/max-in-flight-filter ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 16 19:44:30 crc kubenswrapper[4675]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 16 19:44:30 crc kubenswrapper[4675]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/project.openshift.io-projectcache ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 16 19:44:30 crc kubenswrapper[4675]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 16 19:44:30 crc kubenswrapper[4675]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 16 19:44:30 crc kubenswrapper[4675]: livez check failed Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.824310 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" podUID="76908270-f3ef-412b-a18d-065cf006461e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.834966 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.907718 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" event={"ID":"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f","Type":"ContainerStarted","Data":"dabbe5901aa534480dc63ee8b905f52b7d9a78df20063d10e422fdda5a8e6956"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.907792 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" event={"ID":"b3876426-ba16-40ed-a0f8-aeb0cdb9ae0f","Type":"ContainerStarted","Data":"509866967e8211beb41ede78f83ef7087409532ddde0bbe4169d7fee698c718e"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.909930 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" event={"ID":"7a536c24-ac8f-4add-96e8-064bbcd40ba6","Type":"ContainerDied","Data":"aaf64f03c469b364570831b6285e8fa513b7b9ff1ed74930e06e5806c63ea08f"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.909956 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf64f03c469b364570831b6285e8fa513b7b9ff1ed74930e06e5806c63ea08f" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.916575 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerStarted","Data":"58cebcf6fe579e493c54c2381f509a9ec4673a68ad89651d1637936883489756"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.918735 4675 generic.go:334] "Generic (PLEG): container finished" podID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerID="a51b44a8dd25c0d3cc67de94ee4f65f34d9c3b7027804dcdd04a0b205a346726" exitCode=0 Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.918829 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerDied","Data":"a51b44a8dd25c0d3cc67de94ee4f65f34d9c3b7027804dcdd04a0b205a346726"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.918856 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerStarted","Data":"e85cc4ea14490b679737814c0df7875e00d5c61d44988dec7dceec4ae91ba873"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.921820 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"] Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.922648 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.932243 4675 generic.go:334] "Generic (PLEG): container finished" podID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerID="142797d8f95b84afe1930cda9106e6e018bb86ca29b8f8d1e365525a90c15786" exitCode=0 Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.932318 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerDied","Data":"142797d8f95b84afe1930cda9106e6e018bb86ca29b8f8d1e365525a90c15786"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.932357 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerStarted","Data":"04d0e7b020fbbcb0085bf9028d67c2ff0e702b553087ffd5c58044ee91e18b30"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.935360 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" 
event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerStarted","Data":"e1a580971a4a7f1ad3fead21ac66f6be49eef253bbdb140c59935c084123a160"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.937763 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6","Type":"ContainerStarted","Data":"9dc3de6516253fce61eb791040d2bdc5e2cb7060a284be141dc2d71286fa8c1a"} Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.943166 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg6bd" Feb 16 19:44:30 crc kubenswrapper[4675]: I0216 19:44:30.949357 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vjgs5" podStartSLOduration=12.94933024 podStartE2EDuration="12.94933024s" podCreationTimestamp="2026-02-16 19:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:30.93149336 +0000 UTC m=+154.056782926" watchObservedRunningTime="2026-02-16 19:44:30.94933024 +0000 UTC m=+154.074619786" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.028808 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.021885 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.0218559 podStartE2EDuration="4.0218559s" podCreationTimestamp="2026-02-16 19:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:31.017093825 +0000 UTC m=+154.142383411" watchObservedRunningTime="2026-02-16 19:44:31.0218559 +0000 UTC m=+154.147145456" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.086814 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:44:31 crc kubenswrapper[4675]: E0216 19:44:31.087595 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a536c24-ac8f-4add-96e8-064bbcd40ba6" containerName="collect-profiles" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.087616 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a536c24-ac8f-4add-96e8-064bbcd40ba6" containerName="collect-profiles" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.087741 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a536c24-ac8f-4add-96e8-064bbcd40ba6" containerName="collect-profiles" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.089431 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.105306 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.128034 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.164599 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192390 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftkjc\" (UniqueName: \"kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc\") pod \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192505 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume\") pod \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192556 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume\") pod \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\" (UID: \"7a536c24-ac8f-4add-96e8-064bbcd40ba6\") " Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192753 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities\") pod \"redhat-marketplace-ndhwt\" (UID: 
\"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192786 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8q2r\" (UniqueName: \"kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.192853 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.196569 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a536c24-ac8f-4add-96e8-064bbcd40ba6" (UID: "7a536c24-ac8f-4add-96e8-064bbcd40ba6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.203531 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a536c24-ac8f-4add-96e8-064bbcd40ba6" (UID: "7a536c24-ac8f-4add-96e8-064bbcd40ba6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.204627 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc" (OuterVolumeSpecName: "kube-api-access-ftkjc") pod "7a536c24-ac8f-4add-96e8-064bbcd40ba6" (UID: "7a536c24-ac8f-4add-96e8-064bbcd40ba6"). InnerVolumeSpecName "kube-api-access-ftkjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.296759 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.296833 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8q2r\" (UniqueName: \"kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.296887 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.296987 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftkjc\" (UniqueName: \"kubernetes.io/projected/7a536c24-ac8f-4add-96e8-064bbcd40ba6-kube-api-access-ftkjc\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:31 
crc kubenswrapper[4675]: I0216 19:44:31.296998 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a536c24-ac8f-4add-96e8-064bbcd40ba6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.297010 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a536c24-ac8f-4add-96e8-064bbcd40ba6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.298106 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.298213 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.325100 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8q2r\" (UniqueName: \"kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r\") pod \"redhat-marketplace-ndhwt\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.439233 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"] Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.440605 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.465437 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"] Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.505892 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.510585 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:31 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:31 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:31 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.510642 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.585137 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.607547 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.607603 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6tl\" (UniqueName: \"kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.607733 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.709273 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.709341 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content\") pod \"redhat-marketplace-4p9pw\" 
(UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.709367 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6tl\" (UniqueName: \"kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.710209 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.710425 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.729516 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6tl\" (UniqueName: \"kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl\") pod \"redhat-marketplace-4p9pw\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.762912 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.903121 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.904410 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.955911 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerStarted","Data":"20eb1ad8332fe7b0c9da19df0b9721d3d48649ebcded7ef29598d52f33b3f041"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.959745 4675 generic.go:334] "Generic (PLEG): container finished" podID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerID="16ff49807bf4c43bcd5696854ceb33ebd19d5f1afa99f2673f2701c561ec009f" exitCode=0 Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.960101 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerDied","Data":"16ff49807bf4c43bcd5696854ceb33ebd19d5f1afa99f2673f2701c561ec009f"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.962542 4675 generic.go:334] "Generic (PLEG): container finished" podID="66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" containerID="9dc3de6516253fce61eb791040d2bdc5e2cb7060a284be141dc2d71286fa8c1a" exitCode=0 Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.962863 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6","Type":"ContainerDied","Data":"9dc3de6516253fce61eb791040d2bdc5e2cb7060a284be141dc2d71286fa8c1a"} Feb 16 19:44:31 crc 
kubenswrapper[4675]: I0216 19:44:31.968574 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6eaac5e7-0e5c-421c-9805-d2e10271af87","Type":"ContainerStarted","Data":"4df14799790edefc4f93445280356b9772abbff9cd8bf4ecc4019ea1c332d072"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.968653 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6eaac5e7-0e5c-421c-9805-d2e10271af87","Type":"ContainerStarted","Data":"e5491f87d10faea65caf9877d97616a4f11cf1005ae1ee9a82fffa5eec749107"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.989099 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" event={"ID":"2a8b7c95-1c8f-4de9-907c-34b0b0848b13","Type":"ContainerStarted","Data":"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.989156 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" event={"ID":"2a8b7c95-1c8f-4de9-907c-34b0b0848b13","Type":"ContainerStarted","Data":"0c9d7007fe2bfbd301fb6f11d385445d37255643b4ab60860b542f783eb1f478"} Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.990081 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.992036 4675 generic.go:334] "Generic (PLEG): container finished" podID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerID="c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100" exitCode=0 Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.992109 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521170-6rzxb" Feb 16 19:44:31 crc kubenswrapper[4675]: I0216 19:44:31.993069 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerDied","Data":"c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100"} Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.026421 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.030137 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.032774 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.048491 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.048467372 podStartE2EDuration="2.048467372s" podCreationTimestamp="2026-02-16 19:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:32.028085626 +0000 UTC m=+155.153375182" watchObservedRunningTime="2026-02-16 19:44:32.048467372 +0000 UTC m=+155.173756918" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.048675 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.130298 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content\") 
pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.130363 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2lsl\" (UniqueName: \"kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.130553 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.161994 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" podStartSLOduration=132.161965931 podStartE2EDuration="2m12.161965931s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:44:32.087377387 +0000 UTC m=+155.212666953" watchObservedRunningTime="2026-02-16 19:44:32.161965931 +0000 UTC m=+155.287255477" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.233776 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.234018 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.234054 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2lsl\" (UniqueName: \"kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.235281 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.235759 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.240544 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"] Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.286458 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2lsl\" (UniqueName: \"kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl\") pod \"redhat-operators-kfrnk\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " 
pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.410620 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.411850 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.419782 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.446955 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.511095 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:32 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:32 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:32 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.511168 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.546922 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " 
pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.546976 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z85nv\" (UniqueName: \"kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.547017 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.649589 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.649726 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.649751 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z85nv\" (UniqueName: \"kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " 
pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.659286 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.659592 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.677870 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z85nv\" (UniqueName: \"kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv\") pod \"redhat-operators-nqmcd\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.747365 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:44:32 crc kubenswrapper[4675]: I0216 19:44:32.877218 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:44:32 crc kubenswrapper[4675]: W0216 19:44:32.895515 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda833f00b_0fcb_416e_be3e_c4344adeef8d.slice/crio-5159449c1af4355c29d8f6617020fb2ed75eb703b79b15b82648edfe721eb4a6 WatchSource:0}: Error finding container 5159449c1af4355c29d8f6617020fb2ed75eb703b79b15b82648edfe721eb4a6: Status 404 returned error can't find the container with id 5159449c1af4355c29d8f6617020fb2ed75eb703b79b15b82648edfe721eb4a6 Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.064302 4675 generic.go:334] "Generic (PLEG): container finished" podID="6eaac5e7-0e5c-421c-9805-d2e10271af87" containerID="4df14799790edefc4f93445280356b9772abbff9cd8bf4ecc4019ea1c332d072" exitCode=0 Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.064421 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6eaac5e7-0e5c-421c-9805-d2e10271af87","Type":"ContainerDied","Data":"4df14799790edefc4f93445280356b9772abbff9cd8bf4ecc4019ea1c332d072"} Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.077222 4675 generic.go:334] "Generic (PLEG): container finished" podID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerID="4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880" exitCode=0 Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.077366 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerDied","Data":"4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880"} Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 
19:44:33.077427 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerStarted","Data":"f2156441ab0e48dc3c1bf34f96489a0bbeae30fcd2aa2e9ae819d5cc6e2158e4"} Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.079576 4675 generic.go:334] "Generic (PLEG): container finished" podID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerID="558bf60a74aeb3696d678168de44dd06804866c4fabf9f88a8196ddcbba2f3ba" exitCode=0 Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.079762 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerDied","Data":"558bf60a74aeb3696d678168de44dd06804866c4fabf9f88a8196ddcbba2f3ba"} Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.086498 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerStarted","Data":"5159449c1af4355c29d8f6617020fb2ed75eb703b79b15b82648edfe721eb4a6"} Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.199070 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:44:33 crc kubenswrapper[4675]: W0216 19:44:33.219100 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96436f43_1dfe_4bba_98d5_ee0f45f78415.slice/crio-f3b67a9728eaa915754ce8443088933880a3ad5d6785ead893481d49647dba42 WatchSource:0}: Error finding container f3b67a9728eaa915754ce8443088933880a3ad5d6785ead893481d49647dba42: Status 404 returned error can't find the container with id f3b67a9728eaa915754ce8443088933880a3ad5d6785ead893481d49647dba42 Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.424563 4675 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.513548 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:33 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:33 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:33 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.513636 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.573836 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir\") pod \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.573922 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access\") pod \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\" (UID: \"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6\") " Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.573960 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" (UID: "66b8d4ad-0bb4-4d74-be6c-fde055dd85d6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.574256 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.580374 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" (UID: "66b8d4ad-0bb4-4d74-be6c-fde055dd85d6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:44:33 crc kubenswrapper[4675]: I0216 19:44:33.677199 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66b8d4ad-0bb4-4d74-be6c-fde055dd85d6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.114070 4675 generic.go:334] "Generic (PLEG): container finished" podID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerID="bc6738db2c5ede095f2c58dc545f5a958d65f642903af6f5db67c49011e419df" exitCode=0 Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.114353 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerDied","Data":"bc6738db2c5ede095f2c58dc545f5a958d65f642903af6f5db67c49011e419df"} Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.114414 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerStarted","Data":"f3b67a9728eaa915754ce8443088933880a3ad5d6785ead893481d49647dba42"} Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.141270 4675 
generic.go:334] "Generic (PLEG): container finished" podID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerID="2fdce9395e32381374bb162dc3e507917d276a2bd8696cfea1835a27ace6f7ea" exitCode=0 Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.141790 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerDied","Data":"2fdce9395e32381374bb162dc3e507917d276a2bd8696cfea1835a27ace6f7ea"} Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.158105 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"66b8d4ad-0bb4-4d74-be6c-fde055dd85d6","Type":"ContainerDied","Data":"954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5"} Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.158162 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="954d68788424d901ae8ded8e62fb128de11f3a5cea5af0d0ff8c083bfbc218c5" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.158383 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.511525 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:34 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:34 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:34 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.511617 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.825158 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.909074 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access\") pod \"6eaac5e7-0e5c-421c-9805-d2e10271af87\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.909173 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir\") pod \"6eaac5e7-0e5c-421c-9805-d2e10271af87\" (UID: \"6eaac5e7-0e5c-421c-9805-d2e10271af87\") " Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.909276 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6eaac5e7-0e5c-421c-9805-d2e10271af87" (UID: "6eaac5e7-0e5c-421c-9805-d2e10271af87"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.909594 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6eaac5e7-0e5c-421c-9805-d2e10271af87-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:34 crc kubenswrapper[4675]: I0216 19:44:34.933169 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6eaac5e7-0e5c-421c-9805-d2e10271af87" (UID: "6eaac5e7-0e5c-421c-9805-d2e10271af87"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.011122 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6eaac5e7-0e5c-421c-9805-d2e10271af87-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.221205 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6eaac5e7-0e5c-421c-9805-d2e10271af87","Type":"ContainerDied","Data":"e5491f87d10faea65caf9877d97616a4f11cf1005ae1ee9a82fffa5eec749107"} Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.221252 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5491f87d10faea65caf9877d97616a4f11cf1005ae1ee9a82fffa5eec749107" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.221315 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.509988 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:35 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:35 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:35 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.510353 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.809413 4675 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:35 crc kubenswrapper[4675]: I0216 19:44:35.814727 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-s7kmk" Feb 16 19:44:36 crc kubenswrapper[4675]: I0216 19:44:36.509891 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:36 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:36 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:36 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:36 crc kubenswrapper[4675]: I0216 19:44:36.509978 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:36 crc kubenswrapper[4675]: I0216 19:44:36.613098 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n8s5c" Feb 16 19:44:37 crc kubenswrapper[4675]: I0216 19:44:37.514234 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:37 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:37 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:37 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:37 crc kubenswrapper[4675]: I0216 19:44:37.514824 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" 
podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:38 crc kubenswrapper[4675]: I0216 19:44:38.512943 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:38 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:38 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:38 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:38 crc kubenswrapper[4675]: I0216 19:44:38.513047 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:39 crc kubenswrapper[4675]: I0216 19:44:39.509297 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:39 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:39 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:39 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:39 crc kubenswrapper[4675]: I0216 19:44:39.510018 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.377021 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Liveness probe 
status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.377106 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.377137 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.377398 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.410923 4675 patch_prober.go:28] interesting pod/console-f9d7485db-chc9w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.411048 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-chc9w" podUID="061097a9-32e9-4972-989a-f3777a41bc2b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.529440 4675 patch_prober.go:28] interesting 
pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:40 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:40 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:40 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:40 crc kubenswrapper[4675]: I0216 19:44:40.529581 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:41 crc kubenswrapper[4675]: I0216 19:44:41.512611 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:41 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:41 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:41 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:41 crc kubenswrapper[4675]: I0216 19:44:41.512848 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:42 crc kubenswrapper[4675]: I0216 19:44:42.508408 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:42 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:42 crc kubenswrapper[4675]: 
[+]process-running ok Feb 16 19:44:42 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:42 crc kubenswrapper[4675]: I0216 19:44:42.508494 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:43 crc kubenswrapper[4675]: I0216 19:44:43.221307 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:43 crc kubenswrapper[4675]: I0216 19:44:43.261423 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d5a5b47-38a4-4f7e-b40e-dba4825e18be-metrics-certs\") pod \"network-metrics-daemon-sbgjb\" (UID: \"8d5a5b47-38a4-4f7e-b40e-dba4825e18be\") " pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:43 crc kubenswrapper[4675]: I0216 19:44:43.309564 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbgjb" Feb 16 19:44:43 crc kubenswrapper[4675]: I0216 19:44:43.509945 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:43 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:43 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:43 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:43 crc kubenswrapper[4675]: I0216 19:44:43.510055 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:44 crc kubenswrapper[4675]: I0216 19:44:44.508626 4675 patch_prober.go:28] interesting pod/router-default-5444994796-hrp82 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 19:44:44 crc kubenswrapper[4675]: [-]has-synced failed: reason withheld Feb 16 19:44:44 crc kubenswrapper[4675]: [+]process-running ok Feb 16 19:44:44 crc kubenswrapper[4675]: healthz check failed Feb 16 19:44:44 crc kubenswrapper[4675]: I0216 19:44:44.508756 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrp82" podUID="e02c5612-7f04-4ff9-902e-45c4436e01ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 19:44:45 crc kubenswrapper[4675]: I0216 19:44:45.509717 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:45 crc kubenswrapper[4675]: I0216 19:44:45.513801 4675 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hrp82" Feb 16 19:44:47 crc kubenswrapper[4675]: I0216 19:44:47.553746 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:44:47 crc kubenswrapper[4675]: I0216 19:44:47.554281 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.376059 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.376046 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.376557 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.376440 4675 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.376720 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.378215 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.378312 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.380033 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"7fe56de1a673cb2ec0225543c42828c0571689017c03964e4e1a2e9c9b6985be"} pod="openshift-console/downloads-7954f5f757-6k7j5" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.380528 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" containerID="cri-o://7fe56de1a673cb2ec0225543c42828c0571689017c03964e4e1a2e9c9b6985be" gracePeriod=2 Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.419136 4675 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.423979 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-chc9w" Feb 16 19:44:50 crc kubenswrapper[4675]: I0216 19:44:50.426110 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" Feb 16 19:44:51 crc kubenswrapper[4675]: I0216 19:44:51.449551 4675 generic.go:334] "Generic (PLEG): container finished" podID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerID="7fe56de1a673cb2ec0225543c42828c0571689017c03964e4e1a2e9c9b6985be" exitCode=0 Feb 16 19:44:51 crc kubenswrapper[4675]: I0216 19:44:51.449980 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6k7j5" event={"ID":"5d5f11f1-66e4-4700-9c37-f3843d1769a5","Type":"ContainerDied","Data":"7fe56de1a673cb2ec0225543c42828c0571689017c03964e4e1a2e9c9b6985be"} Feb 16 19:44:58 crc kubenswrapper[4675]: E0216 19:44:58.779347 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 16 19:44:58 crc kubenswrapper[4675]: E0216 19:44:58.780324 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57kvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-grmd4_openshift-marketplace(ee9cd13e-3dd5-4fbf-8989-965350dff2e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:44:58 crc kubenswrapper[4675]: E0216 19:44:58.781539 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-grmd4" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" Feb 16 19:45:00 crc 
kubenswrapper[4675]: I0216 19:45:00.145023 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d"] Feb 16 19:45:00 crc kubenswrapper[4675]: E0216 19:45:00.145301 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaac5e7-0e5c-421c-9805-d2e10271af87" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.145314 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaac5e7-0e5c-421c-9805-d2e10271af87" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: E0216 19:45:00.145326 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.145332 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.145456 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaac5e7-0e5c-421c-9805-d2e10271af87" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.145466 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b8d4ad-0bb4-4d74-be6c-fde055dd85d6" containerName="pruner" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.146039 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.150630 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.150920 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.156234 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d"] Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.312670 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xq9\" (UniqueName: \"kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.312755 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.312797 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.376638 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.376779 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.414625 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xq9\" (UniqueName: \"kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.414718 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.414751 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.416024 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.433025 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.433953 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xq9\" (UniqueName: \"kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9\") pod \"collect-profiles-29521185-z8j6d\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:00 crc kubenswrapper[4675]: I0216 19:45:00.472583 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:01 crc kubenswrapper[4675]: I0216 19:45:01.595256 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n5h4j" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.536967 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-grmd4" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.640409 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.641139 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2lsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kfrnk_openshift-marketplace(a833f00b-0fcb-416e-be3e-c4344adeef8d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.643056 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kfrnk" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" Feb 16 19:45:03 crc 
kubenswrapper[4675]: E0216 19:45:03.648996 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.649199 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z85nv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-nqmcd_openshift-marketplace(96436f43-1dfe-4bba-98d5-ee0f45f78415): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:03 crc kubenswrapper[4675]: E0216 19:45:03.650434 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nqmcd" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" Feb 16 19:45:04 crc kubenswrapper[4675]: E0216 19:45:04.915033 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kfrnk" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" Feb 16 19:45:04 crc kubenswrapper[4675]: E0216 19:45:04.915127 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nqmcd" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" Feb 16 19:45:04 crc kubenswrapper[4675]: E0216 19:45:04.989542 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 16 19:45:04 crc kubenswrapper[4675]: E0216 19:45:04.989748 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j8q2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ndhwt_openshift-marketplace(fecbc375-b333-42c1-bff9-3d17abec2eb2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:04 crc kubenswrapper[4675]: E0216 19:45:04.991221 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ndhwt" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" Feb 16 19:45:05 crc kubenswrapper[4675]: E0216 19:45:05.017336 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 16 19:45:05 crc kubenswrapper[4675]: E0216 19:45:05.017518 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kgvmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gfhrq_openshift-marketplace(c1ac438d-9403-455d-869e-7d9cc34cf15f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:05 crc kubenswrapper[4675]: E0216 19:45:05.018983 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gfhrq" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.247572 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.248752 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.253037 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.259529 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.260083 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.306417 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.306469 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.333276 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ndhwt" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.333315 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gfhrq" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.407954 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.408017 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.408142 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.429510 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.430270 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.431915 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5q59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-h6msp_openshift-marketplace(eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:06 crc 
kubenswrapper[4675]: E0216 19:45:06.433163 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-h6msp" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.436588 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.436769 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ccc57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bbn8r_openshift-marketplace(edd3f360-2ab8-4310-ac1a-3473a48bd5ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.437951 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bbn8r" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" Feb 16 19:45:06 crc 
kubenswrapper[4675]: E0216 19:45:06.525292 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.525471 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sd6tl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-4p9pw_openshift-marketplace(dd4b563d-e717-4e75-9a4f-f069626a819a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.526720 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4p9pw" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.548013 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6k7j5" event={"ID":"5d5f11f1-66e4-4700-9c37-f3843d1769a5","Type":"ContainerStarted","Data":"2b4275ff7d56e02e62a8ff700c1b8501ca3e696ce786f9dce9ff6b69497ca11c"} Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.549300 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.549597 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.549631 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.551171 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-h6msp" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.551345 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4p9pw" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" Feb 16 19:45:06 crc kubenswrapper[4675]: E0216 19:45:06.551378 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bbn8r" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.592909 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.801881 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sbgjb"] Feb 16 19:45:06 crc kubenswrapper[4675]: W0216 19:45:06.811587 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5a5b47_38a4_4f7e_b40e_dba4825e18be.slice/crio-5f5123ef58d9ea5dddd67a35623e1b84512b64b7ea8d1ff0054f08c5aa6856c6 WatchSource:0}: Error finding container 5f5123ef58d9ea5dddd67a35623e1b84512b64b7ea8d1ff0054f08c5aa6856c6: Status 404 returned error can't find the container with id 5f5123ef58d9ea5dddd67a35623e1b84512b64b7ea8d1ff0054f08c5aa6856c6 Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.920359 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 19:45:06 crc kubenswrapper[4675]: W0216 19:45:06.925139 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6c431e_5c6b_4f37_aff2_ac4853838032.slice/crio-e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a WatchSource:0}: Error finding container e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a: Status 404 returned error can't find the container with id e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a Feb 16 19:45:06 crc kubenswrapper[4675]: I0216 19:45:06.931163 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d"] Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.069703 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 16 19:45:07 crc kubenswrapper[4675]: W0216 19:45:07.078560 4675 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f321a9e_46e1_41a9_a12f_799a118215a1.slice/crio-cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0 WatchSource:0}: Error finding container cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0: Status 404 returned error can't find the container with id cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0 Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.558528 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" event={"ID":"8d5a5b47-38a4-4f7e-b40e-dba4825e18be","Type":"ContainerStarted","Data":"df2237009c40cd82e7576bb7313b158ebdd1d9f2219be3b5202beec6980d28ac"} Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.558750 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" event={"ID":"8d5a5b47-38a4-4f7e-b40e-dba4825e18be","Type":"ContainerStarted","Data":"5f5123ef58d9ea5dddd67a35623e1b84512b64b7ea8d1ff0054f08c5aa6856c6"} Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.559670 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f321a9e-46e1-41a9-a12f-799a118215a1","Type":"ContainerStarted","Data":"cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0"} Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.564216 4675 generic.go:334] "Generic (PLEG): container finished" podID="ac6c431e-5c6b-4f37-aff2-ac4853838032" containerID="79e06a93a171d8693176182a68e6f0a97021c499bd5d462c66ac50ac31f37b5e" exitCode=0 Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.564492 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" event={"ID":"ac6c431e-5c6b-4f37-aff2-ac4853838032","Type":"ContainerDied","Data":"79e06a93a171d8693176182a68e6f0a97021c499bd5d462c66ac50ac31f37b5e"} Feb 16 19:45:07 crc 
kubenswrapper[4675]: I0216 19:45:07.564563 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" event={"ID":"ac6c431e-5c6b-4f37-aff2-ac4853838032","Type":"ContainerStarted","Data":"e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a"} Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.565956 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:07 crc kubenswrapper[4675]: I0216 19:45:07.566050 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.572300 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbgjb" event={"ID":"8d5a5b47-38a4-4f7e-b40e-dba4825e18be","Type":"ContainerStarted","Data":"ff2f10f7d6d879b7014135d443d2f451bbf3bf95c6c7f836c4a70f54d62ce8f0"} Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.583517 4675 generic.go:334] "Generic (PLEG): container finished" podID="2f321a9e-46e1-41a9-a12f-799a118215a1" containerID="be02a6da67faf41a443cc7366fdbf79fd0879015b4fd3f17b17f49768240902f" exitCode=0 Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.584071 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f321a9e-46e1-41a9-a12f-799a118215a1","Type":"ContainerDied","Data":"be02a6da67faf41a443cc7366fdbf79fd0879015b4fd3f17b17f49768240902f"} Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.586999 4675 
patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.587094 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:08 crc kubenswrapper[4675]: I0216 19:45:08.592087 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sbgjb" podStartSLOduration=168.592065342 podStartE2EDuration="2m48.592065342s" podCreationTimestamp="2026-02-16 19:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:45:08.589014772 +0000 UTC m=+191.714304338" watchObservedRunningTime="2026-02-16 19:45:08.592065342 +0000 UTC m=+191.717354918" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.032839 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.155231 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume\") pod \"ac6c431e-5c6b-4f37-aff2-ac4853838032\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.155386 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96xq9\" (UniqueName: \"kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9\") pod \"ac6c431e-5c6b-4f37-aff2-ac4853838032\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.155444 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume\") pod \"ac6c431e-5c6b-4f37-aff2-ac4853838032\" (UID: \"ac6c431e-5c6b-4f37-aff2-ac4853838032\") " Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.156657 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac6c431e-5c6b-4f37-aff2-ac4853838032" (UID: "ac6c431e-5c6b-4f37-aff2-ac4853838032"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.157317 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6c431e-5c6b-4f37-aff2-ac4853838032-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.164838 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9" (OuterVolumeSpecName: "kube-api-access-96xq9") pod "ac6c431e-5c6b-4f37-aff2-ac4853838032" (UID: "ac6c431e-5c6b-4f37-aff2-ac4853838032"). InnerVolumeSpecName "kube-api-access-96xq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.166819 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac6c431e-5c6b-4f37-aff2-ac4853838032" (UID: "ac6c431e-5c6b-4f37-aff2-ac4853838032"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.258152 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac6c431e-5c6b-4f37-aff2-ac4853838032-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.258192 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96xq9\" (UniqueName: \"kubernetes.io/projected/ac6c431e-5c6b-4f37-aff2-ac4853838032-kube-api-access-96xq9\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.593676 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" event={"ID":"ac6c431e-5c6b-4f37-aff2-ac4853838032","Type":"ContainerDied","Data":"e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a"} Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.593758 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7cc6e94b77f96a886834351544819a3060c1ece85e1bb81d30491d952fba33a" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.593827 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521185-z8j6d" Feb 16 19:45:09 crc kubenswrapper[4675]: I0216 19:45:09.901221 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.066780 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access\") pod \"2f321a9e-46e1-41a9-a12f-799a118215a1\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.066890 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir\") pod \"2f321a9e-46e1-41a9-a12f-799a118215a1\" (UID: \"2f321a9e-46e1-41a9-a12f-799a118215a1\") " Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.067137 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f321a9e-46e1-41a9-a12f-799a118215a1" (UID: "2f321a9e-46e1-41a9-a12f-799a118215a1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.067369 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f321a9e-46e1-41a9-a12f-799a118215a1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.077032 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f321a9e-46e1-41a9-a12f-799a118215a1" (UID: "2f321a9e-46e1-41a9-a12f-799a118215a1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.119521 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"] Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.169379 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f321a9e-46e1-41a9-a12f-799a118215a1-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.375906 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.376217 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.375905 4675 patch_prober.go:28] interesting pod/downloads-7954f5f757-6k7j5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.376268 4675 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6k7j5" podUID="5d5f11f1-66e4-4700-9c37-f3843d1769a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.601510 4675 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f321a9e-46e1-41a9-a12f-799a118215a1","Type":"ContainerDied","Data":"cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0"} Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.602641 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd974b63961ffaab1198bd92b757ea4b135d3b946b9c6bd34397f0155904b9e0" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.601588 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.845998 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 16 19:45:10 crc kubenswrapper[4675]: E0216 19:45:10.850738 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6c431e-5c6b-4f37-aff2-ac4853838032" containerName="collect-profiles" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.850779 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6c431e-5c6b-4f37-aff2-ac4853838032" containerName="collect-profiles" Feb 16 19:45:10 crc kubenswrapper[4675]: E0216 19:45:10.850815 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f321a9e-46e1-41a9-a12f-799a118215a1" containerName="pruner" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.850824 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f321a9e-46e1-41a9-a12f-799a118215a1" containerName="pruner" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.851013 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6c431e-5c6b-4f37-aff2-ac4853838032" containerName="collect-profiles" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.851033 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f321a9e-46e1-41a9-a12f-799a118215a1" containerName="pruner" Feb 16 19:45:10 crc 
kubenswrapper[4675]: I0216 19:45:10.857476 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.857754 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.863903 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.865056 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.880827 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.880887 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.880976 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.983093 4675 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.983196 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.983230 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.983674 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:10 crc kubenswrapper[4675]: I0216 19:45:10.983738 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:11 crc kubenswrapper[4675]: I0216 19:45:11.004592 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:11 crc kubenswrapper[4675]: I0216 19:45:11.192877 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:11 crc kubenswrapper[4675]: I0216 19:45:11.697721 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 16 19:45:12 crc kubenswrapper[4675]: I0216 19:45:12.616203 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d51e99ec-7a52-4a98-83b0-9f3948b3c957","Type":"ContainerStarted","Data":"8dfb8f83a9674f93466547d8541922a76d7f9348c01c7828013924c816927b43"} Feb 16 19:45:12 crc kubenswrapper[4675]: I0216 19:45:12.617464 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d51e99ec-7a52-4a98-83b0-9f3948b3c957","Type":"ContainerStarted","Data":"1d6578f181c23d21fb6c24bec6519610d46af0693f9f6f7c5bbcc6b939f9f692"} Feb 16 19:45:12 crc kubenswrapper[4675]: I0216 19:45:12.641834 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.641805372 podStartE2EDuration="2.641805372s" podCreationTimestamp="2026-02-16 19:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:45:12.640669072 +0000 UTC m=+195.765958638" watchObservedRunningTime="2026-02-16 19:45:12.641805372 +0000 UTC m=+195.767094928" Feb 16 19:45:17 crc kubenswrapper[4675]: I0216 19:45:17.554302 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 16 19:45:17 crc kubenswrapper[4675]: I0216 19:45:17.555111 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:45:20 crc kubenswrapper[4675]: I0216 19:45:20.385803 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6k7j5" Feb 16 19:45:21 crc kubenswrapper[4675]: I0216 19:45:21.670459 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerStarted","Data":"fb2c8d0e9d900a1e511afdb320a9532c52ac05e656217e8d076fff94db24638f"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.701034 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerStarted","Data":"2107c3931e2a211169fc7be814dd3700e16689ca03a14512561e275df538ea93"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.703833 4675 generic.go:334] "Generic (PLEG): container finished" podID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerID="fb2c8d0e9d900a1e511afdb320a9532c52ac05e656217e8d076fff94db24638f" exitCode=0 Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.703909 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerDied","Data":"fb2c8d0e9d900a1e511afdb320a9532c52ac05e656217e8d076fff94db24638f"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.708909 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" 
event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerStarted","Data":"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.717301 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerStarted","Data":"8ceb1340194ea6c535d30d3a8e72737b9761d312c8e99f72a3c8e8e9edf5e015"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.720578 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerStarted","Data":"555ca99dd51df738bc403af046e1a4cfeabd5180e6ffa43b503b78207bba9337"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.723511 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerStarted","Data":"e7c213a51a79b37eaf84b9f493031f15913047e95fb3c30fc2937779ba69353b"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.726110 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerStarted","Data":"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"} Feb 16 19:45:23 crc kubenswrapper[4675]: I0216 19:45:23.731062 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerStarted","Data":"71838788e4e8f4cd2516ebd1f6177d4353440a42c2162854b2540ebf51eeb1b2"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.745836 4675 generic.go:334] "Generic (PLEG): container finished" podID="c1ac438d-9403-455d-869e-7d9cc34cf15f" 
containerID="184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.745981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerDied","Data":"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.750863 4675 generic.go:334] "Generic (PLEG): container finished" podID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerID="71838788e4e8f4cd2516ebd1f6177d4353440a42c2162854b2540ebf51eeb1b2" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.750939 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerDied","Data":"71838788e4e8f4cd2516ebd1f6177d4353440a42c2162854b2540ebf51eeb1b2"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.754709 4675 generic.go:334] "Generic (PLEG): container finished" podID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerID="2107c3931e2a211169fc7be814dd3700e16689ca03a14512561e275df538ea93" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.754757 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerDied","Data":"2107c3931e2a211169fc7be814dd3700e16689ca03a14512561e275df538ea93"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.758671 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerStarted","Data":"0b3c8c43fb4f6ed25f069d5090f2a09e29cd1b2fb61055470c6d56117dc1560e"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.762867 4675 generic.go:334] "Generic (PLEG): container 
finished" podID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerID="9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.762946 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerDied","Data":"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.769723 4675 generic.go:334] "Generic (PLEG): container finished" podID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerID="8ceb1340194ea6c535d30d3a8e72737b9761d312c8e99f72a3c8e8e9edf5e015" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.769766 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerDied","Data":"8ceb1340194ea6c535d30d3a8e72737b9761d312c8e99f72a3c8e8e9edf5e015"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.773471 4675 generic.go:334] "Generic (PLEG): container finished" podID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerID="e7c213a51a79b37eaf84b9f493031f15913047e95fb3c30fc2937779ba69353b" exitCode=0 Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.773515 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerDied","Data":"e7c213a51a79b37eaf84b9f493031f15913047e95fb3c30fc2937779ba69353b"} Feb 16 19:45:24 crc kubenswrapper[4675]: I0216 19:45:24.791507 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nqmcd" podStartSLOduration=2.814254717 podStartE2EDuration="52.791454441s" podCreationTimestamp="2026-02-16 19:44:32 +0000 UTC" firstStartedPulling="2026-02-16 19:44:34.11806755 +0000 UTC 
m=+157.243357106" lastFinishedPulling="2026-02-16 19:45:24.095267274 +0000 UTC m=+207.220556830" observedRunningTime="2026-02-16 19:45:24.787817246 +0000 UTC m=+207.913106812" watchObservedRunningTime="2026-02-16 19:45:24.791454441 +0000 UTC m=+207.916743997" Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.780806 4675 generic.go:334] "Generic (PLEG): container finished" podID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerID="555ca99dd51df738bc403af046e1a4cfeabd5180e6ffa43b503b78207bba9337" exitCode=0 Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.780894 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerDied","Data":"555ca99dd51df738bc403af046e1a4cfeabd5180e6ffa43b503b78207bba9337"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.785882 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerStarted","Data":"be5cf4f9a7b6873d51e9f73e7841c3ddf0ea4356b75217fcfddd827a437bff5e"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.789433 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerStarted","Data":"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.792515 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerStarted","Data":"144ee6fd7a97b4a71b0833920c5a41a5c5d867dcfb40cb217f8fb029acc45cab"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.794971 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" 
event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerStarted","Data":"877a79e2937424f39a4867f6f4a274bd649ac2396a6e38ca7c52594e7de3f2c5"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.798576 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerStarted","Data":"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.812218 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerStarted","Data":"3d127274c6289a876eebbf2766615789d6b8503a82480dc02fc5374f7198afa1"} Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.845155 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbn8r" podStartSLOduration=3.386186902 podStartE2EDuration="56.845129655s" podCreationTimestamp="2026-02-16 19:44:29 +0000 UTC" firstStartedPulling="2026-02-16 19:44:31.981009736 +0000 UTC m=+155.106299282" lastFinishedPulling="2026-02-16 19:45:25.439952479 +0000 UTC m=+208.565242035" observedRunningTime="2026-02-16 19:45:25.840833123 +0000 UTC m=+208.966122679" watchObservedRunningTime="2026-02-16 19:45:25.845129655 +0000 UTC m=+208.970419211" Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.869611 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6msp" podStartSLOduration=3.416374637 podStartE2EDuration="57.869588307s" podCreationTimestamp="2026-02-16 19:44:28 +0000 UTC" firstStartedPulling="2026-02-16 19:44:30.922201226 +0000 UTC m=+154.047490782" lastFinishedPulling="2026-02-16 19:45:25.375414896 +0000 UTC m=+208.500704452" observedRunningTime="2026-02-16 19:45:25.866922217 +0000 UTC m=+208.992211773" 
watchObservedRunningTime="2026-02-16 19:45:25.869588307 +0000 UTC m=+208.994877863"
Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.894679 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4p9pw" podStartSLOduration=2.708825117 podStartE2EDuration="54.894647824s" podCreationTimestamp="2026-02-16 19:44:31 +0000 UTC" firstStartedPulling="2026-02-16 19:44:33.087109552 +0000 UTC m=+156.212399108" lastFinishedPulling="2026-02-16 19:45:25.272932259 +0000 UTC m=+208.398221815" observedRunningTime="2026-02-16 19:45:25.890793893 +0000 UTC m=+209.016083459" watchObservedRunningTime="2026-02-16 19:45:25.894647824 +0000 UTC m=+209.019937400"
Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.941896 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gfhrq" podStartSLOduration=3.714247157 podStartE2EDuration="56.941869962s" podCreationTimestamp="2026-02-16 19:44:29 +0000 UTC" firstStartedPulling="2026-02-16 19:44:31.99445745 +0000 UTC m=+155.119747006" lastFinishedPulling="2026-02-16 19:45:25.222080265 +0000 UTC m=+208.347369811" observedRunningTime="2026-02-16 19:45:25.918854389 +0000 UTC m=+209.044143965" watchObservedRunningTime="2026-02-16 19:45:25.941869962 +0000 UTC m=+209.067159518"
Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.973619 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-grmd4" podStartSLOduration=3.725673509 podStartE2EDuration="57.973585114s" podCreationTimestamp="2026-02-16 19:44:28 +0000 UTC" firstStartedPulling="2026-02-16 19:44:30.934425338 +0000 UTC m=+154.059714894" lastFinishedPulling="2026-02-16 19:45:25.182336933 +0000 UTC m=+208.307626499" observedRunningTime="2026-02-16 19:45:25.939218933 +0000 UTC m=+209.064508489" watchObservedRunningTime="2026-02-16 19:45:25.973585114 +0000 UTC m=+209.098874670"
Feb 16 19:45:25 crc kubenswrapper[4675]: I0216 19:45:25.975477 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ndhwt" podStartSLOduration=3.722820075 podStartE2EDuration="55.975471024s" podCreationTimestamp="2026-02-16 19:44:30 +0000 UTC" firstStartedPulling="2026-02-16 19:44:33.08740789 +0000 UTC m=+156.212697446" lastFinishedPulling="2026-02-16 19:45:25.340058839 +0000 UTC m=+208.465348395" observedRunningTime="2026-02-16 19:45:25.973265946 +0000 UTC m=+209.098555512" watchObservedRunningTime="2026-02-16 19:45:25.975471024 +0000 UTC m=+209.100760580"
Feb 16 19:45:26 crc kubenswrapper[4675]: I0216 19:45:26.821850 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerStarted","Data":"d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815"}
Feb 16 19:45:27 crc kubenswrapper[4675]: I0216 19:45:27.874302 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kfrnk" podStartSLOduration=4.829348875 podStartE2EDuration="56.874279652s" podCreationTimestamp="2026-02-16 19:44:31 +0000 UTC" firstStartedPulling="2026-02-16 19:44:34.15716052 +0000 UTC m=+157.282450076" lastFinishedPulling="2026-02-16 19:45:26.202091297 +0000 UTC m=+209.327380853" observedRunningTime="2026-02-16 19:45:27.869277561 +0000 UTC m=+210.994567137" watchObservedRunningTime="2026-02-16 19:45:27.874279652 +0000 UTC m=+210.999569208"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.434291 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6msp"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.434840 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6msp"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.739061 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbn8r"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.739140 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbn8r"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.776152 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.776879 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-grmd4"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.776952 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.777349 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-grmd4"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.797017 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6msp"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.800627 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbn8r"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.836063 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:29 crc kubenswrapper[4675]: I0216 19:45:29.854917 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-grmd4"
Feb 16 19:45:30 crc kubenswrapper[4675]: I0216 19:45:30.905529 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-grmd4"
Feb 16 19:45:30 crc kubenswrapper[4675]: I0216 19:45:30.937802 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.585320 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ndhwt"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.585899 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ndhwt"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.630151 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ndhwt"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.764277 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4p9pw"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.764334 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4p9pw"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.808708 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4p9pw"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.910603 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4p9pw"
Feb 16 19:45:31 crc kubenswrapper[4675]: I0216 19:45:31.914043 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ndhwt"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.420369 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kfrnk"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.420579 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kfrnk"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.747937 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nqmcd"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.748074 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nqmcd"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.808821 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nqmcd"
Feb 16 19:45:32 crc kubenswrapper[4675]: I0216 19:45:32.918621 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nqmcd"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.320338 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"]
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.328971 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gfhrq" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="registry-server" containerID="cri-o://339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6" gracePeriod=2
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.472951 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kfrnk" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" probeResult="failure" output=<
Feb 16 19:45:33 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s
Feb 16 19:45:33 crc kubenswrapper[4675]: >
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.736481 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.883195 4675 generic.go:334] "Generic (PLEG): container finished" podID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerID="339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6" exitCode=0
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.883706 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gfhrq"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.891763 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities\") pod \"c1ac438d-9403-455d-869e-7d9cc34cf15f\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") "
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.891850 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgvmx\" (UniqueName: \"kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx\") pod \"c1ac438d-9403-455d-869e-7d9cc34cf15f\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") "
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.891959 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content\") pod \"c1ac438d-9403-455d-869e-7d9cc34cf15f\" (UID: \"c1ac438d-9403-455d-869e-7d9cc34cf15f\") "
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.893171 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities" (OuterVolumeSpecName: "utilities") pod "c1ac438d-9403-455d-869e-7d9cc34cf15f" (UID: "c1ac438d-9403-455d-869e-7d9cc34cf15f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.893982 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerDied","Data":"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"}
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.894046 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gfhrq" event={"ID":"c1ac438d-9403-455d-869e-7d9cc34cf15f","Type":"ContainerDied","Data":"58cebcf6fe579e493c54c2381f509a9ec4673a68ad89651d1637936883489756"}
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.894073 4675 scope.go:117] "RemoveContainer" containerID="339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.900102 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx" (OuterVolumeSpecName: "kube-api-access-kgvmx") pod "c1ac438d-9403-455d-869e-7d9cc34cf15f" (UID: "c1ac438d-9403-455d-869e-7d9cc34cf15f"). InnerVolumeSpecName "kube-api-access-kgvmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.935863 4675 scope.go:117] "RemoveContainer" containerID="184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.951603 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1ac438d-9403-455d-869e-7d9cc34cf15f" (UID: "c1ac438d-9403-455d-869e-7d9cc34cf15f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.958835 4675 scope.go:117] "RemoveContainer" containerID="c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.980843 4675 scope.go:117] "RemoveContainer" containerID="339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"
Feb 16 19:45:33 crc kubenswrapper[4675]: E0216 19:45:33.981584 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6\": container with ID starting with 339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6 not found: ID does not exist" containerID="339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.981651 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6"} err="failed to get container status \"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6\": rpc error: code = NotFound desc = could not find container \"339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6\": container with ID starting with 339df0b6bf520729733b7ad39e03466240548d937c312b5b81de878d085fd8c6 not found: ID does not exist"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.981724 4675 scope.go:117] "RemoveContainer" containerID="184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"
Feb 16 19:45:33 crc kubenswrapper[4675]: E0216 19:45:33.982599 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5\": container with ID starting with 184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5 not found: ID does not exist" containerID="184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.982652 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5"} err="failed to get container status \"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5\": rpc error: code = NotFound desc = could not find container \"184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5\": container with ID starting with 184b8f30ad8b6247137a807a2fd588f707837c803aeb6409ff81125a5a2e3ba5 not found: ID does not exist"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.982731 4675 scope.go:117] "RemoveContainer" containerID="c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100"
Feb 16 19:45:33 crc kubenswrapper[4675]: E0216 19:45:33.983235 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100\": container with ID starting with c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100 not found: ID does not exist" containerID="c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.983286 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100"} err="failed to get container status \"c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100\": rpc error: code = NotFound desc = could not find container \"c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100\": container with ID starting with c5dc11cf020461835bf53e389d30c924e97df969eb8e5a66bb03980a2007d100 not found: ID does not exist"
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.993420 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.993455 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgvmx\" (UniqueName: \"kubernetes.io/projected/c1ac438d-9403-455d-869e-7d9cc34cf15f-kube-api-access-kgvmx\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:33 crc kubenswrapper[4675]: I0216 19:45:33.993471 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ac438d-9403-455d-869e-7d9cc34cf15f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:34 crc kubenswrapper[4675]: I0216 19:45:34.220318 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"]
Feb 16 19:45:34 crc kubenswrapper[4675]: I0216 19:45:34.222974 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gfhrq"]
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.188170 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerName="oauth-openshift" containerID="cri-o://63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137" gracePeriod=15
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.633852 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.717604 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"]
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.718012 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4p9pw" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="registry-server" containerID="cri-o://a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747" gracePeriod=2
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822568 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822658 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822726 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822815 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822854 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822890 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822930 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.822984 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823055 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823105 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kkpj\" (UniqueName: \"kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823137 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823202 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823255 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823298 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca\") pod \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\" (UID: \"b5a72330-c8e5-4be3-8083-ced47a7b6ada\") "
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.823850 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.824352 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.824380 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.824450 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.824765 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.830637 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.831630 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.832110 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.832495 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.832844 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.833752 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.834502 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.835516 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.841492 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj" (OuterVolumeSpecName: "kube-api-access-7kkpj") pod "b5a72330-c8e5-4be3-8083-ced47a7b6ada" (UID: "b5a72330-c8e5-4be3-8083-ced47a7b6ada"). InnerVolumeSpecName "kube-api-access-7kkpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.893026 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" path="/var/lib/kubelet/pods/c1ac438d-9403-455d-869e-7d9cc34cf15f/volumes"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.911090 4675 generic.go:334] "Generic (PLEG): container finished" podID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerID="63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137" exitCode=0
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.911163 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" event={"ID":"b5a72330-c8e5-4be3-8083-ced47a7b6ada","Type":"ContainerDied","Data":"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"}
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.911207 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-55l96" event={"ID":"b5a72330-c8e5-4be3-8083-ced47a7b6ada","Type":"ContainerDied","Data":"2f2effe6edd1814211f494b314a63a0456fd25f540f453ec51b05d8962f43ed5"}
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.911246 4675 scope.go:117] "RemoveContainer" containerID="63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.911217 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-55l96"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926004 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926095 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kkpj\" (UniqueName: \"kubernetes.io/projected/b5a72330-c8e5-4be3-8083-ced47a7b6ada-kube-api-access-7kkpj\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926117 4675 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926136 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926155 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926177 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926197 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926217 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926239 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926260 4675 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926281 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926340 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926362 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.926421 4675 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5a72330-c8e5-4be3-8083-ced47a7b6ada-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.945750 4675 scope.go:117] "RemoveContainer" containerID="63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"
Feb 16 19:45:35 crc kubenswrapper[4675]: E0216 19:45:35.946426 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137\": container with ID starting with 63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137 not found: ID does not exist" containerID="63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.946474 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137"} err="failed to get container status \"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137\": rpc error: code = NotFound desc = could not find container \"63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137\": container with ID starting with 63b74ff70e37b150a2ae3bc386d8433b45f1a4842761386dc46a197d6fb2c137 not found: ID does not exist"
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.948899 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"]
Feb 16 19:45:35 crc kubenswrapper[4675]: I0216 19:45:35.954372 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-55l96"]
Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.499346 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4p9pw"
Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617073 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-5hck8"]
Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617289 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerName="oauth-openshift"
Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617302 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerName="oauth-openshift"
Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617310 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="extract-utilities"
Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617316 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="extract-utilities"
Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617325 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="extract-content"
Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617331 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="extract-content"
Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617341
4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="extract-content" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617347 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="extract-content" Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617361 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617369 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617384 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617389 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: E0216 19:45:36.617397 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="extract-utilities" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617402 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="extract-utilities" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617484 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" containerName="oauth-openshift" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617498 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.617509 
4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ac438d-9403-455d-869e-7d9cc34cf15f" containerName="registry-server" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.619208 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.628399 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.628470 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.628544 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.628888 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.629620 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.629674 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.629749 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.636219 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.637114 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.639307 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.639835 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.640195 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.640644 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6tl\" (UniqueName: \"kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl\") pod \"dd4b563d-e717-4e75-9a4f-f069626a819a\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.640783 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content\") pod \"dd4b563d-e717-4e75-9a4f-f069626a819a\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.640829 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities\") pod \"dd4b563d-e717-4e75-9a4f-f069626a819a\" (UID: \"dd4b563d-e717-4e75-9a4f-f069626a819a\") " Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.644282 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities" (OuterVolumeSpecName: "utilities") pod "dd4b563d-e717-4e75-9a4f-f069626a819a" (UID: 
"dd4b563d-e717-4e75-9a4f-f069626a819a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.652602 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.653674 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.663115 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl" (OuterVolumeSpecName: "kube-api-access-sd6tl") pod "dd4b563d-e717-4e75-9a4f-f069626a819a" (UID: "dd4b563d-e717-4e75-9a4f-f069626a819a"). InnerVolumeSpecName "kube-api-access-sd6tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.664241 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-5hck8"] Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.668253 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd4b563d-e717-4e75-9a4f-f069626a819a" (UID: "dd4b563d-e717-4e75-9a4f-f069626a819a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.673958 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743172 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743506 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743613 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcv7\" (UniqueName: \"kubernetes.io/projected/e842339d-5874-45ac-9b15-4debc8083be5-kube-api-access-vbcv7\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743808 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: 
\"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743899 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.743986 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.744128 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.744612 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-audit-policies\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 
19:45:36.744738 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.744877 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.744986 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e842339d-5874-45ac-9b15-4debc8083be5-audit-dir\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745103 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745195 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745277 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745394 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745466 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4b563d-e717-4e75-9a4f-f069626a819a-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.745527 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6tl\" (UniqueName: \"kubernetes.io/projected/dd4b563d-e717-4e75-9a4f-f069626a819a-kube-api-access-sd6tl\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.846712 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc 
kubenswrapper[4675]: I0216 19:45:36.847194 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.847572 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.847921 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.848210 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.848446 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcv7\" (UniqueName: 
\"kubernetes.io/projected/e842339d-5874-45ac-9b15-4debc8083be5-kube-api-access-vbcv7\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.848734 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.849025 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.849251 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.849483 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " 
pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.849769 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-audit-policies\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.848838 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.849299 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.850279 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.850540 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.850843 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e842339d-5874-45ac-9b15-4debc8083be5-audit-dir\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.851051 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.851104 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.851011 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e842339d-5874-45ac-9b15-4debc8083be5-audit-dir\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 
19:45:36.851282 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e842339d-5874-45ac-9b15-4debc8083be5-audit-policies\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.852345 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.853465 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.854407 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.854993 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: 
\"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.855259 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.856008 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.857764 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e842339d-5874-45ac-9b15-4debc8083be5-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.867290 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcv7\" (UniqueName: \"kubernetes.io/projected/e842339d-5874-45ac-9b15-4debc8083be5-kube-api-access-vbcv7\") pod \"oauth-openshift-7448d7568b-5hck8\" (UID: \"e842339d-5874-45ac-9b15-4debc8083be5\") " pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.920293 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="dd4b563d-e717-4e75-9a4f-f069626a819a" containerID="a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747" exitCode=0 Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.920345 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerDied","Data":"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747"} Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.920428 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4p9pw" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.920957 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4p9pw" event={"ID":"dd4b563d-e717-4e75-9a4f-f069626a819a","Type":"ContainerDied","Data":"f2156441ab0e48dc3c1bf34f96489a0bbeae30fcd2aa2e9ae819d5cc6e2158e4"} Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.920997 4675 scope.go:117] "RemoveContainer" containerID="a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.951968 4675 scope.go:117] "RemoveContainer" containerID="9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.961017 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"] Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.968025 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4p9pw"] Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.979989 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:36 crc kubenswrapper[4675]: I0216 19:45:36.989344 4675 scope.go:117] "RemoveContainer" containerID="4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.014299 4675 scope.go:117] "RemoveContainer" containerID="a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747" Feb 16 19:45:37 crc kubenswrapper[4675]: E0216 19:45:37.015298 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747\": container with ID starting with a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747 not found: ID does not exist" containerID="a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.015434 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747"} err="failed to get container status \"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747\": rpc error: code = NotFound desc = could not find container \"a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747\": container with ID starting with a1104e844e8344246083bde5b150f6a026ab7fcd728dc72eabebf35c2df26747 not found: ID does not exist" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.015481 4675 scope.go:117] "RemoveContainer" containerID="9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e" Feb 16 19:45:37 crc kubenswrapper[4675]: E0216 19:45:37.016187 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e\": container with ID starting with 
9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e not found: ID does not exist" containerID="9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.016248 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e"} err="failed to get container status \"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e\": rpc error: code = NotFound desc = could not find container \"9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e\": container with ID starting with 9e1b8b847061a07134ec15d7fce2076205481b89fc8ba62c42c02d0dce98c97e not found: ID does not exist" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.016352 4675 scope.go:117] "RemoveContainer" containerID="4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880" Feb 16 19:45:37 crc kubenswrapper[4675]: E0216 19:45:37.017938 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880\": container with ID starting with 4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880 not found: ID does not exist" containerID="4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.017978 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880"} err="failed to get container status \"4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880\": rpc error: code = NotFound desc = could not find container \"4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880\": container with ID starting with 4f5061c85516b54e8ac021280919bb4299d7ebc5bf6cba3f339b865f7a89d880 not found: ID does not 
exist" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.124677 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.125032 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nqmcd" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="registry-server" containerID="cri-o://0b3c8c43fb4f6ed25f069d5090f2a09e29cd1b2fb61055470c6d56117dc1560e" gracePeriod=2 Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.220975 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-5hck8"] Feb 16 19:45:37 crc kubenswrapper[4675]: W0216 19:45:37.229086 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode842339d_5874_45ac_9b15_4debc8083be5.slice/crio-50d1187188e623727183175c84ddeb66f2b781de005359813581b0ab3bc28462 WatchSource:0}: Error finding container 50d1187188e623727183175c84ddeb66f2b781de005359813581b0ab3bc28462: Status 404 returned error can't find the container with id 50d1187188e623727183175c84ddeb66f2b781de005359813581b0ab3bc28462 Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.893131 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a72330-c8e5-4be3-8083-ced47a7b6ada" path="/var/lib/kubelet/pods/b5a72330-c8e5-4be3-8083-ced47a7b6ada/volumes" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.895092 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4b563d-e717-4e75-9a4f-f069626a819a" path="/var/lib/kubelet/pods/dd4b563d-e717-4e75-9a4f-f069626a819a/volumes" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.936590 4675 generic.go:334] "Generic (PLEG): container finished" podID="96436f43-1dfe-4bba-98d5-ee0f45f78415" 
containerID="0b3c8c43fb4f6ed25f069d5090f2a09e29cd1b2fb61055470c6d56117dc1560e" exitCode=0 Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.936668 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerDied","Data":"0b3c8c43fb4f6ed25f069d5090f2a09e29cd1b2fb61055470c6d56117dc1560e"} Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.940596 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" event={"ID":"e842339d-5874-45ac-9b15-4debc8083be5","Type":"ContainerStarted","Data":"00a5b1fc67c7ff602b85159dfd2d0ebdb7d738be37d3aaafa97caf3f9e5fc0c2"} Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.940643 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" event={"ID":"e842339d-5874-45ac-9b15-4debc8083be5","Type":"ContainerStarted","Data":"50d1187188e623727183175c84ddeb66f2b781de005359813581b0ab3bc28462"} Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.941933 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.946804 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" Feb 16 19:45:37 crc kubenswrapper[4675]: I0216 19:45:37.971594 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7448d7568b-5hck8" podStartSLOduration=27.971571563 podStartE2EDuration="27.971571563s" podCreationTimestamp="2026-02-16 19:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:45:37.9687571 +0000 UTC m=+221.094046686" 
watchObservedRunningTime="2026-02-16 19:45:37.971571563 +0000 UTC m=+221.096861109" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.005598 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.173771 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities\") pod \"96436f43-1dfe-4bba-98d5-ee0f45f78415\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.174223 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z85nv\" (UniqueName: \"kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv\") pod \"96436f43-1dfe-4bba-98d5-ee0f45f78415\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.174283 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content\") pod \"96436f43-1dfe-4bba-98d5-ee0f45f78415\" (UID: \"96436f43-1dfe-4bba-98d5-ee0f45f78415\") " Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.175047 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities" (OuterVolumeSpecName: "utilities") pod "96436f43-1dfe-4bba-98d5-ee0f45f78415" (UID: "96436f43-1dfe-4bba-98d5-ee0f45f78415"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.175966 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.188000 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv" (OuterVolumeSpecName: "kube-api-access-z85nv") pod "96436f43-1dfe-4bba-98d5-ee0f45f78415" (UID: "96436f43-1dfe-4bba-98d5-ee0f45f78415"). InnerVolumeSpecName "kube-api-access-z85nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.277283 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z85nv\" (UniqueName: \"kubernetes.io/projected/96436f43-1dfe-4bba-98d5-ee0f45f78415-kube-api-access-z85nv\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.300599 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96436f43-1dfe-4bba-98d5-ee0f45f78415" (UID: "96436f43-1dfe-4bba-98d5-ee0f45f78415"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.378450 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96436f43-1dfe-4bba-98d5-ee0f45f78415-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.948127 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqmcd" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.948144 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqmcd" event={"ID":"96436f43-1dfe-4bba-98d5-ee0f45f78415","Type":"ContainerDied","Data":"f3b67a9728eaa915754ce8443088933880a3ad5d6785ead893481d49647dba42"} Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.948226 4675 scope.go:117] "RemoveContainer" containerID="0b3c8c43fb4f6ed25f069d5090f2a09e29cd1b2fb61055470c6d56117dc1560e" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.976525 4675 scope.go:117] "RemoveContainer" containerID="fb2c8d0e9d900a1e511afdb320a9532c52ac05e656217e8d076fff94db24638f" Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.981727 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.986944 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nqmcd"] Feb 16 19:45:38 crc kubenswrapper[4675]: I0216 19:45:38.998443 4675 scope.go:117] "RemoveContainer" containerID="bc6738db2c5ede095f2c58dc545f5a958d65f642903af6f5db67c49011e419df" Feb 16 19:45:39 crc kubenswrapper[4675]: I0216 19:45:39.484829 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:45:39 crc kubenswrapper[4675]: I0216 19:45:39.777555 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:45:39 crc kubenswrapper[4675]: I0216 19:45:39.891848 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" path="/var/lib/kubelet/pods/96436f43-1dfe-4bba-98d5-ee0f45f78415/volumes" Feb 16 19:45:41 crc kubenswrapper[4675]: I0216 19:45:41.716448 4675 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:45:41 crc kubenswrapper[4675]: I0216 19:45:41.717070 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bbn8r" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="registry-server" containerID="cri-o://be5cf4f9a7b6873d51e9f73e7841c3ddf0ea4356b75217fcfddd827a437bff5e" gracePeriod=2 Feb 16 19:45:41 crc kubenswrapper[4675]: I0216 19:45:41.973216 4675 generic.go:334] "Generic (PLEG): container finished" podID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerID="be5cf4f9a7b6873d51e9f73e7841c3ddf0ea4356b75217fcfddd827a437bff5e" exitCode=0 Feb 16 19:45:41 crc kubenswrapper[4675]: I0216 19:45:41.973310 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerDied","Data":"be5cf4f9a7b6873d51e9f73e7841c3ddf0ea4356b75217fcfddd827a437bff5e"} Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.117235 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.236447 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccc57\" (UniqueName: \"kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57\") pod \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.236576 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities\") pod \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.236644 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content\") pod \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\" (UID: \"edd3f360-2ab8-4310-ac1a-3473a48bd5ae\") " Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.237490 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities" (OuterVolumeSpecName: "utilities") pod "edd3f360-2ab8-4310-ac1a-3473a48bd5ae" (UID: "edd3f360-2ab8-4310-ac1a-3473a48bd5ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.242651 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57" (OuterVolumeSpecName: "kube-api-access-ccc57") pod "edd3f360-2ab8-4310-ac1a-3473a48bd5ae" (UID: "edd3f360-2ab8-4310-ac1a-3473a48bd5ae"). InnerVolumeSpecName "kube-api-access-ccc57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.287071 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edd3f360-2ab8-4310-ac1a-3473a48bd5ae" (UID: "edd3f360-2ab8-4310-ac1a-3473a48bd5ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.338186 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccc57\" (UniqueName: \"kubernetes.io/projected/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-kube-api-access-ccc57\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.338228 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.338238 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd3f360-2ab8-4310-ac1a-3473a48bd5ae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.473460 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.516131 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.983621 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbn8r" 
event={"ID":"edd3f360-2ab8-4310-ac1a-3473a48bd5ae","Type":"ContainerDied","Data":"e1a580971a4a7f1ad3fead21ac66f6be49eef253bbdb140c59935c084123a160"} Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.983660 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbn8r" Feb 16 19:45:42 crc kubenswrapper[4675]: I0216 19:45:42.983756 4675 scope.go:117] "RemoveContainer" containerID="be5cf4f9a7b6873d51e9f73e7841c3ddf0ea4356b75217fcfddd827a437bff5e" Feb 16 19:45:43 crc kubenswrapper[4675]: I0216 19:45:43.009929 4675 scope.go:117] "RemoveContainer" containerID="e7c213a51a79b37eaf84b9f493031f15913047e95fb3c30fc2937779ba69353b" Feb 16 19:45:43 crc kubenswrapper[4675]: I0216 19:45:43.046225 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:45:43 crc kubenswrapper[4675]: I0216 19:45:43.049018 4675 scope.go:117] "RemoveContainer" containerID="16ff49807bf4c43bcd5696854ceb33ebd19d5f1afa99f2673f2701c561ec009f" Feb 16 19:45:43 crc kubenswrapper[4675]: I0216 19:45:43.049602 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bbn8r"] Feb 16 19:45:43 crc kubenswrapper[4675]: I0216 19:45:43.892948 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" path="/var/lib/kubelet/pods/edd3f360-2ab8-4310-ac1a-3473a48bd5ae/volumes" Feb 16 19:45:47 crc kubenswrapper[4675]: I0216 19:45:47.554589 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:45:47 crc kubenswrapper[4675]: I0216 19:45:47.554729 4675 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:45:47 crc kubenswrapper[4675]: I0216 19:45:47.554833 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:45:47 crc kubenswrapper[4675]: I0216 19:45:47.555870 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 19:45:47 crc kubenswrapper[4675]: I0216 19:45:47.555962 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" containerID="cri-o://92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058" gracePeriod=600 Feb 16 19:45:48 crc kubenswrapper[4675]: I0216 19:45:48.031261 4675 generic.go:334] "Generic (PLEG): container finished" podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058" exitCode=0 Feb 16 19:45:48 crc kubenswrapper[4675]: I0216 19:45:48.031317 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058"} Feb 16 19:45:48 crc kubenswrapper[4675]: I0216 19:45:48.031353 4675 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6"} Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.088908 4675 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.093498 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="extract-content" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.093542 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="extract-content" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.093575 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.093585 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.093610 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="extract-utilities" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.093622 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="extract-utilities" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.093634 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="extract-utilities" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.093643 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" 
containerName="extract-utilities" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.095265 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="extract-content" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.095283 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="extract-content" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.095302 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.095310 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.095654 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd3f360-2ab8-4310-ac1a-3473a48bd5ae" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.095674 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="96436f43-1dfe-4bba-98d5-ee0f45f78415" containerName="registry-server" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.096396 4675 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.096448 4675 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.096950 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.096968 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.096989 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.096999 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.097013 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097039 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.097057 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097065 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.097074 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097083 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.097103 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097111 4675 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.097130 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097142 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097424 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097437 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097448 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097461 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.097477 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.098045 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.098205 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.100084 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826" gracePeriod=15 Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.100463 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b" gracePeriod=15 Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.100772 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75" gracePeriod=15 Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.100875 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a" gracePeriod=15 Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.100879 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107" gracePeriod=15 Feb 16 19:45:50 crc 
kubenswrapper[4675]: I0216 19:45:50.103404 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.153302 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.254029 4675 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b.scope\": RecentStats: unable to find data in memory cache]" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266440 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266499 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266543 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266561 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266593 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266616 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266634 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.266716 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368148 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368222 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368268 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368289 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368293 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368318 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368342 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368380 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368385 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368415 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368434 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368466 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368460 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368513 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368573 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.368440 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: I0216 19:45:50.451512 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:45:50 crc kubenswrapper[4675]: W0216 19:45:50.472869 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b3ee15db1606b8353fb08ff68933ae7d02fd89407dcd7ccacbd9288df4b7352b WatchSource:0}: Error finding container b3ee15db1606b8353fb08ff68933ae7d02fd89407dcd7ccacbd9288df4b7352b: Status 404 returned error can't find the container with id b3ee15db1606b8353fb08ff68933ae7d02fd89407dcd7ccacbd9288df4b7352b Feb 16 19:45:50 crc kubenswrapper[4675]: E0216 19:45:50.477229 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.177:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894d1ba0d0b52d9 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-16 19:45:50.476415705 +0000 UTC m=+233.601705261,LastTimestamp:2026-02-16 19:45:50.476415705 +0000 UTC m=+233.601705261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.064032 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986"} Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.064103 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b3ee15db1606b8353fb08ff68933ae7d02fd89407dcd7ccacbd9288df4b7352b"} Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.067107 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.067161 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.069890 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.070676 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b" exitCode=0 Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.070735 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a" exitCode=0 Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.070744 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75" exitCode=0 Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.070753 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107" exitCode=2 Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.070829 4675 scope.go:117] "RemoveContainer" containerID="c762d1facdba3325880ae6b1e2704c626ae67915808b527959795774e2a1ba62" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.073614 4675 generic.go:334] "Generic (PLEG): container finished" podID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" containerID="8dfb8f83a9674f93466547d8541922a76d7f9348c01c7828013924c816927b43" exitCode=0 Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.073721 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d51e99ec-7a52-4a98-83b0-9f3948b3c957","Type":"ContainerDied","Data":"8dfb8f83a9674f93466547d8541922a76d7f9348c01c7828013924c816927b43"} Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.074593 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:51 crc kubenswrapper[4675]: I0216 19:45:51.075168 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.085010 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.539006 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.540554 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.541074 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.547958 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.548689 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.549224 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.549438 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.549684 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613432 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613485 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613514 4675 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access\") pod \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613557 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613582 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir\") pod \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613620 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock\") pod \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\" (UID: \"d51e99ec-7a52-4a98-83b0-9f3948b3c957\") " Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613890 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock" (OuterVolumeSpecName: "var-lock") pod "d51e99ec-7a52-4a98-83b0-9f3948b3c957" (UID: "d51e99ec-7a52-4a98-83b0-9f3948b3c957"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613935 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d51e99ec-7a52-4a98-83b0-9f3948b3c957" (UID: "d51e99ec-7a52-4a98-83b0-9f3948b3c957"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613960 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.613979 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.614003 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.624294 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d51e99ec-7a52-4a98-83b0-9f3948b3c957" (UID: "d51e99ec-7a52-4a98-83b0-9f3948b3c957"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715437 4675 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715498 4675 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715524 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715551 4675 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715574 4675 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:52 crc kubenswrapper[4675]: I0216 19:45:52.715595 4675 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/d51e99ec-7a52-4a98-83b0-9f3948b3c957-var-lock\") on node \"crc\" DevicePath \"\"" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.107780 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.110009 4675 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826" exitCode=0 Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.110184 4675 scope.go:117] "RemoveContainer" containerID="0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.110442 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.112786 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d51e99ec-7a52-4a98-83b0-9f3948b3c957","Type":"ContainerDied","Data":"1d6578f181c23d21fb6c24bec6519610d46af0693f9f6f7c5bbcc6b939f9f692"} Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.112899 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6578f181c23d21fb6c24bec6519610d46af0693f9f6f7c5bbcc6b939f9f692" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.112915 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.144188 4675 scope.go:117] "RemoveContainer" containerID="344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.149379 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.149860 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.151273 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.151632 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.152015 4675 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.152342 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.162961 4675 scope.go:117] "RemoveContainer" containerID="3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.190844 4675 scope.go:117] "RemoveContainer" containerID="0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.215520 4675 scope.go:117] "RemoveContainer" containerID="055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.233436 4675 scope.go:117] "RemoveContainer" containerID="e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.273811 4675 scope.go:117] "RemoveContainer" containerID="0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.276214 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\": container with ID starting with 0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b not found: ID does not exist" containerID="0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b" 
Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.276352 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b"} err="failed to get container status \"0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\": rpc error: code = NotFound desc = could not find container \"0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b\": container with ID starting with 0cc4f98152f284333d41071adaefada973e3396c3cdb2c6ee3c662f0043fe65b not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.276459 4675 scope.go:117] "RemoveContainer" containerID="344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.277071 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\": container with ID starting with 344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a not found: ID does not exist" containerID="344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.277164 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a"} err="failed to get container status \"344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\": rpc error: code = NotFound desc = could not find container \"344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a\": container with ID starting with 344f8bd7bee340cc040a2556a0e92c055ed7fbad886f85750b66d21ace766e8a not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.277289 4675 scope.go:117] "RemoveContainer" 
containerID="3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.278037 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\": container with ID starting with 3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75 not found: ID does not exist" containerID="3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.278094 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75"} err="failed to get container status \"3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\": rpc error: code = NotFound desc = could not find container \"3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75\": container with ID starting with 3beefddcc0b5e2e84c15063f467b9043ba9caea11fbc3f7fb8fc1ece3b8c8f75 not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.278129 4675 scope.go:117] "RemoveContainer" containerID="0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.279346 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\": container with ID starting with 0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107 not found: ID does not exist" containerID="0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.279399 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107"} err="failed to get container status \"0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\": rpc error: code = NotFound desc = could not find container \"0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107\": container with ID starting with 0dbd2e87a95c7964a1bafb765ec76d7f18e180cc0e47df9fde7b526b3717a107 not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.279437 4675 scope.go:117] "RemoveContainer" containerID="055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.279887 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\": container with ID starting with 055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826 not found: ID does not exist" containerID="055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.279914 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826"} err="failed to get container status \"055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\": rpc error: code = NotFound desc = could not find container \"055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826\": container with ID starting with 055f2d1052fed3e5e8d660243c3d0595e3168a670ea7a22ef51b199528ffe826 not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.279934 4675 scope.go:117] "RemoveContainer" containerID="e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375" Feb 16 19:45:53 crc kubenswrapper[4675]: E0216 19:45:53.280331 4675 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\": container with ID starting with e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375 not found: ID does not exist" containerID="e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.280451 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375"} err="failed to get container status \"e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\": rpc error: code = NotFound desc = could not find container \"e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375\": container with ID starting with e8d02ff95555956c2d70f990ae0da13779b951070e565dffbde5c873909fa375 not found: ID does not exist" Feb 16 19:45:53 crc kubenswrapper[4675]: I0216 19:45:53.892214 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 16 19:45:57 crc kubenswrapper[4675]: I0216 19:45:57.896289 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:57 crc kubenswrapper[4675]: I0216 19:45:57.898030 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 
19:45:57 crc kubenswrapper[4675]: E0216 19:45:57.933365 4675 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.177:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894d1ba0d0b52d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-16 19:45:50.476415705 +0000 UTC m=+233.601705261,LastTimestamp:2026-02-16 19:45:50.476415705 +0000 UTC m=+233.601705261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.273417 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.275046 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.275424 4675 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.275842 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.276200 4675 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:45:59 crc kubenswrapper[4675]: I0216 19:45:59.276248 4675 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.276547 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="200ms" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.477489 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="400ms" Feb 16 19:45:59 crc kubenswrapper[4675]: E0216 19:45:59.878884 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="800ms" Feb 16 19:46:00 crc kubenswrapper[4675]: 
E0216 19:46:00.324570 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:46:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:46:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:46:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T19:46:00Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.325261 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.325606 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.325996 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.326361 4675 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.326400 4675 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 19:46:00 crc kubenswrapper[4675]: E0216 19:46:00.680423 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="1.6s" Feb 16 19:46:02 crc kubenswrapper[4675]: E0216 19:46:02.299409 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="3.2s" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.191660 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.191763 4675 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0" exitCode=1 Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.191809 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0"} Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.192306 4675 scope.go:117] "RemoveContainer" containerID="abb4dfa515be84263cef6de8c300018193409908292cbfe0d19610ade9532ad0" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.194202 4675 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.194989 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.195387 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.884307 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.885839 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.886284 4675 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.886911 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.900635 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.900702 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:04 crc kubenswrapper[4675]: E0216 19:46:04.901226 4675 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.177:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:04 crc kubenswrapper[4675]: I0216 19:46:04.901706 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:04 crc kubenswrapper[4675]: W0216 19:46:04.921876 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7b727e4f84241af26472ad0a9f7638a73481a29c097a1f292b458cf5eafdc00e WatchSource:0}: Error finding container 7b727e4f84241af26472ad0a9f7638a73481a29c097a1f292b458cf5eafdc00e: Status 404 returned error can't find the container with id 7b727e4f84241af26472ad0a9f7638a73481a29c097a1f292b458cf5eafdc00e Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.203371 4675 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="20c323891320321e8b33340fa3ec230811b4fabcfb7e5b59187fb002017a83f3" exitCode=0 Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.203511 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"20c323891320321e8b33340fa3ec230811b4fabcfb7e5b59187fb002017a83f3"} Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.203560 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b727e4f84241af26472ad0a9f7638a73481a29c097a1f292b458cf5eafdc00e"} Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.204022 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.204047 4675 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:05 crc kubenswrapper[4675]: E0216 19:46:05.204974 4675 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.205573 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.205988 4675 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.206392 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.216323 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.216432 4675 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef311d14609aabded4acc7ab084a11e9b7d8745d98c4af6038d1ac16827e9512"} Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.217820 4675 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.218324 4675 status_manager.go:851] "Failed to get status for pod" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.219188 4675 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.177:6443: connect: connection refused" Feb 16 19:46:05 crc kubenswrapper[4675]: E0216 19:46:05.509083 4675 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.177:6443: connect: connection refused" interval="6.4s" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.605985 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.606869 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 16 19:46:05 crc kubenswrapper[4675]: I0216 19:46:05.606964 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 16 19:46:06 crc kubenswrapper[4675]: I0216 19:46:06.230780 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4c1cd6dce1c893c7445e4bdee9bb30c16f7b8092374338d6a586fcff7712bd5b"} Feb 16 19:46:06 crc kubenswrapper[4675]: I0216 19:46:06.230864 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1f010e642c26fed66a257a6effcede5079baa3d02195901b17e6c20b2152be7a"} Feb 16 19:46:06 crc kubenswrapper[4675]: I0216 19:46:06.230878 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"085f869a59eca994ffa84ab2496e6b15b453721801f1ffce2cb18e062e48820e"} Feb 16 19:46:07 crc kubenswrapper[4675]: I0216 19:46:07.241731 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"90a589ab43689b1079c40ce3b924f2fc3b5b995a2d21eed39774f09130fbe750"} Feb 16 19:46:07 crc kubenswrapper[4675]: I0216 19:46:07.242164 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1223896380ed94f810ccbd1883a41a48a429621d402dd06a5fd173cecae2e658"} Feb 16 19:46:07 crc kubenswrapper[4675]: I0216 19:46:07.242181 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:07 crc kubenswrapper[4675]: I0216 19:46:07.242111 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:07 crc kubenswrapper[4675]: I0216 19:46:07.242202 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:09 crc kubenswrapper[4675]: I0216 19:46:09.902208 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:09 crc kubenswrapper[4675]: I0216 19:46:09.902753 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:09 crc kubenswrapper[4675]: I0216 19:46:09.908351 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:11 crc kubenswrapper[4675]: I0216 19:46:11.067478 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:46:12 crc kubenswrapper[4675]: I0216 19:46:12.259941 4675 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" 
Feb 16 19:46:12 crc kubenswrapper[4675]: I0216 19:46:12.391384 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1f89b390-3769-4f8a-b17a-e0c713fefa01" Feb 16 19:46:13 crc kubenswrapper[4675]: I0216 19:46:13.282580 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:13 crc kubenswrapper[4675]: I0216 19:46:13.283218 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:13 crc kubenswrapper[4675]: I0216 19:46:13.287986 4675 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1f89b390-3769-4f8a-b17a-e0c713fefa01" Feb 16 19:46:13 crc kubenswrapper[4675]: I0216 19:46:13.288446 4675 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://085f869a59eca994ffa84ab2496e6b15b453721801f1ffce2cb18e062e48820e" Feb 16 19:46:13 crc kubenswrapper[4675]: I0216 19:46:13.288538 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:14 crc kubenswrapper[4675]: I0216 19:46:14.289270 4675 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:14 crc kubenswrapper[4675]: I0216 19:46:14.289308 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cc6bac98-79c6-4970-ba8b-7996a4046f7d" Feb 16 19:46:14 crc kubenswrapper[4675]: I0216 19:46:14.294069 4675 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1f89b390-3769-4f8a-b17a-e0c713fefa01" Feb 16 19:46:15 crc kubenswrapper[4675]: I0216 19:46:15.596615 4675 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 16 19:46:15 crc kubenswrapper[4675]: I0216 19:46:15.596753 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 16 19:46:21 crc kubenswrapper[4675]: I0216 19:46:21.672593 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 16 19:46:21 crc kubenswrapper[4675]: I0216 19:46:21.847579 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.380255 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.712647 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.759004 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.859205 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.861580 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 16 19:46:22 crc kubenswrapper[4675]: I0216 19:46:22.939967 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.215394 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.576288 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.601876 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.637055 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.702787 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.731980 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.854545 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.910042 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 16 19:46:23 crc kubenswrapper[4675]: I0216 19:46:23.927682 4675 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.024579 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.047210 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.093427 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.167382 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.188413 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.247367 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.254639 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.285833 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.350397 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.385139 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.443237 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.550521 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.616735 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.682119 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 16 19:46:24 crc kubenswrapper[4675]: I0216 19:46:24.942809 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.034943 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.135984 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.209666 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.332729 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.340509 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 16 
19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.421563 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.448038 4675 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.467127 4675 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.471346 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.471327803 podStartE2EDuration="35.471327803s" podCreationTimestamp="2026-02-16 19:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:46:12.277097409 +0000 UTC m=+255.402386975" watchObservedRunningTime="2026-02-16 19:46:25.471327803 +0000 UTC m=+268.596617359" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.471555 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.471601 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.476299 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.497957 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.517144 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" 
Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.517784 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.517676679000001 podStartE2EDuration="13.517676679s" podCreationTimestamp="2026-02-16 19:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:46:25.493775642 +0000 UTC m=+268.619065208" watchObservedRunningTime="2026-02-16 19:46:25.517676679 +0000 UTC m=+268.642966245" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.599604 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.603136 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.604416 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.785889 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.853414 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.854058 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 16 19:46:25 crc kubenswrapper[4675]: I0216 19:46:25.906030 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 16 
19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.023214 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.038390 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.047639 4675 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.058486 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.058584 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.124623 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.149180 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.168845 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.217799 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.274575 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.300449 4675 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.303337 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.339495 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.339829 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.361772 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.440930 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.470330 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.520049 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.583279 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.593680 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.653208 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.656655 4675 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.680586 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.728029 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.840837 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.941009 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 16 19:46:26 crc kubenswrapper[4675]: I0216 19:46:26.951701 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.042712 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.080462 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.106836 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.151408 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.156742 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.215247 4675 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.226497 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.282844 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.342253 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.393090 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.396011 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.396710 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.462127 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.479019 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.484130 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.523989 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.557431 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.620225 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.696703 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.716027 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.717562 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.841883 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.875870 4675 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.966026 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 16 19:46:27 crc kubenswrapper[4675]: I0216 19:46:27.985942 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.027293 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.103434 4675 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.120408 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.160045 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.242176 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.261709 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.271431 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.307433 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.412308 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.423943 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.485021 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.566030 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 
19:46:28.567770 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.597936 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.597954 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.600225 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.692957 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.759375 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.772933 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.827194 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.856849 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 16 19:46:28 crc kubenswrapper[4675]: I0216 19:46:28.866415 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.064139 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.065060 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.072557 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.256580 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.339194 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.458753 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.464285 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.540287 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.559392 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.576487 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.664252 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.677235 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.721420 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.738297 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 16 19:46:29 crc kubenswrapper[4675]: I0216 19:46:29.926795 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.205918 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.268500 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.287528 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.287748 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.306004 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.327575 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.432223 4675 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.478906 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.566919 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.642531 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.715611 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.873752 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.883829 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.909213 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 16 19:46:30 crc kubenswrapper[4675]: I0216 19:46:30.990666 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.057837 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.107040 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.130794 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.181164 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.357255 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.432901 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.485516 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.574931 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.578278 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.636214 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.690640 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.691826 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.769222 
4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.807145 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.824844 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.834357 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.987457 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.990447 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 16 19:46:31 crc kubenswrapper[4675]: I0216 19:46:31.992274 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.060281 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.060676 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.104412 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.166868 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 16 
19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.176271 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.224285 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.334302 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.352882 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.518260 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.518312 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.575175 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.609345 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.628009 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.646742 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.772448 4675 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 16 19:46:32 crc kubenswrapper[4675]: I0216 19:46:32.976867 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.026730 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.074720 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.118149 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.157525 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.235550 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.326759 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.326873 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.439573 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.477616 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 16 19:46:33 crc kubenswrapper[4675]: 
I0216 19:46:33.509605 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.517969 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.596596 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.606583 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.631353 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.673165 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.872664 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.902798 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 16 19:46:33 crc kubenswrapper[4675]: I0216 19:46:33.932267 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.093425 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.180623 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.305459 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.326152 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.374314 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.608065 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.611818 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.644594 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.651879 4675 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.652140 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986" gracePeriod=5 Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.789389 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 
19:46:34.818498 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.945655 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 16 19:46:34 crc kubenswrapper[4675]: I0216 19:46:34.968739 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.153991 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.261117 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.352602 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.359235 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.445528 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.481327 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.515878 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.568647 4675 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.590763 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.945761 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 16 19:46:35 crc kubenswrapper[4675]: I0216 19:46:35.946565 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.122419 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.416811 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.451209 4675 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.563866 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.699627 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.710189 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.763632 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.798310 4675 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.920523 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.924157 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.959089 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 16 19:46:36 crc kubenswrapper[4675]: I0216 19:46:36.998167 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.047195 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.061596 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.151169 4675 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.153739 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.318727 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.482027 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.528492 4675 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.921765 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 16 19:46:37 crc kubenswrapper[4675]: I0216 19:46:37.952500 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 16 19:46:38 crc kubenswrapper[4675]: I0216 19:46:38.325923 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 16 19:46:38 crc kubenswrapper[4675]: I0216 19:46:38.336834 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 19:46:38 crc kubenswrapper[4675]: I0216 19:46:38.952845 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.230764 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.230868 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253399 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253495 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253569 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253592 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253659 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253630 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253716 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253789 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.253925 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.254125 4675 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.254178 4675 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.254200 4675 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.254221 4675 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.264595 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.354645 4675 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.471425 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.471878 4675 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986" exitCode=137 Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.471945 4675 scope.go:117] "RemoveContainer" containerID="2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.472094 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.504075 4675 scope.go:117] "RemoveContainer" containerID="2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986" Feb 16 19:46:40 crc kubenswrapper[4675]: E0216 19:46:40.505263 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986\": container with ID starting with 2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986 not found: ID does not exist" containerID="2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986" Feb 16 19:46:40 crc kubenswrapper[4675]: I0216 19:46:40.505500 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986"} err="failed to get container status \"2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986\": rpc error: code = NotFound desc = could not find container \"2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986\": container with ID starting with 2adb10031936044102ec0d994b68def7e135ca8be994850c22c4189b70849986 not found: ID does not exist" Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 19:46:41.890873 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 19:46:41.892069 4675 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 19:46:41.904272 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 
19:46:41.904320 4675 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5857d915-f4a8-4b90-83d2-e870ea83bffa" Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 19:46:41.909125 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 19:46:41 crc kubenswrapper[4675]: I0216 19:46:41.909177 4675 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5857d915-f4a8-4b90-83d2-e870ea83bffa" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.262875 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.263258 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-grmd4" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="registry-server" containerID="cri-o://877a79e2937424f39a4867f6f4a274bd649ac2396a6e38ca7c52594e7de3f2c5" gracePeriod=30 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.275606 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.277164 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6msp" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="registry-server" containerID="cri-o://144ee6fd7a97b4a71b0833920c5a41a5c5d867dcfb40cb217f8fb029acc45cab" gracePeriod=30 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.293790 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.294066 4675 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" containerID="cri-o://fb1bd0c01fe0d208c5c17d908853e15d6ee199b812c327ac4ce94792b244add8" gracePeriod=30 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.305869 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.306592 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ndhwt" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="registry-server" containerID="cri-o://3d127274c6289a876eebbf2766615789d6b8503a82480dc02fc5374f7198afa1" gracePeriod=30 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.328324 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.328599 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kfrnk" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" containerID="cri-o://d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" gracePeriod=30 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.349757 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pww7"] Feb 16 19:46:42 crc kubenswrapper[4675]: E0216 19:46:42.350383 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.350497 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 19:46:42 crc 
kubenswrapper[4675]: E0216 19:46:42.350609 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" containerName="installer" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.350878 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" containerName="installer" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.351120 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.351224 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51e99ec-7a52-4a98-83b0-9f3948b3c957" containerName="installer" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.351950 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.361802 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pww7"] Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.398781 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.398854 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9rk\" (UniqueName: \"kubernetes.io/projected/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-kube-api-access-4n9rk\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.398884 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: E0216 19:46:42.421080 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815 is running failed: container process not found" containerID="d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" cmd=["grpc_health_probe","-addr=:50051"] Feb 16 19:46:42 crc kubenswrapper[4675]: E0216 19:46:42.421606 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815 is running failed: container process not found" containerID="d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" cmd=["grpc_health_probe","-addr=:50051"] Feb 16 19:46:42 crc kubenswrapper[4675]: E0216 19:46:42.422579 4675 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815 is running failed: container process not found" containerID="d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" cmd=["grpc_health_probe","-addr=:50051"] Feb 16 19:46:42 crc kubenswrapper[4675]: E0216 19:46:42.422619 4675 prober.go:104] "Probe errored" err="rpc error: code = NotFound 
desc = container is not created or running: checking if PID of d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-kfrnk" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.491353 4675 generic.go:334] "Generic (PLEG): container finished" podID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerID="877a79e2937424f39a4867f6f4a274bd649ac2396a6e38ca7c52594e7de3f2c5" exitCode=0 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.491574 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerDied","Data":"877a79e2937424f39a4867f6f4a274bd649ac2396a6e38ca7c52594e7de3f2c5"} Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.503009 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.503071 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9rk\" (UniqueName: \"kubernetes.io/projected/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-kube-api-access-4n9rk\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.503088 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.505294 4675 generic.go:334] "Generic (PLEG): container finished" podID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerID="3d127274c6289a876eebbf2766615789d6b8503a82480dc02fc5374f7198afa1" exitCode=0 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.505379 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerDied","Data":"3d127274c6289a876eebbf2766615789d6b8503a82480dc02fc5374f7198afa1"} Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.513659 4675 generic.go:334] "Generic (PLEG): container finished" podID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerID="d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" exitCode=0 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.513740 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerDied","Data":"d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815"} Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.513830 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.514966 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.516664 4675 generic.go:334] "Generic (PLEG): container finished" podID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerID="fb1bd0c01fe0d208c5c17d908853e15d6ee199b812c327ac4ce94792b244add8" exitCode=0 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.516725 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" event={"ID":"408894c5-c798-48ff-93ac-bc8ea114ee4a","Type":"ContainerDied","Data":"fb1bd0c01fe0d208c5c17d908853e15d6ee199b812c327ac4ce94792b244add8"} Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.524016 4675 generic.go:334] "Generic (PLEG): container finished" podID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerID="144ee6fd7a97b4a71b0833920c5a41a5c5d867dcfb40cb217f8fb029acc45cab" exitCode=0 Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.524072 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerDied","Data":"144ee6fd7a97b4a71b0833920c5a41a5c5d867dcfb40cb217f8fb029acc45cab"} Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.524585 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9rk\" (UniqueName: \"kubernetes.io/projected/e7e5daa8-b9d8-4f3f-902e-36273ad65acb-kube-api-access-4n9rk\") pod \"marketplace-operator-79b997595-2pww7\" (UID: \"e7e5daa8-b9d8-4f3f-902e-36273ad65acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.775906 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.780940 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.787780 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.795164 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.813804 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.815016 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914567 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities\") pod \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914643 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics\") pod \"408894c5-c798-48ff-93ac-bc8ea114ee4a\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914709 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content\") pod \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914750 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities\") pod \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914796 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content\") pod \"a833f00b-0fcb-416e-be3e-c4344adeef8d\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914828 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-4xtpz\" (UniqueName: \"kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz\") pod \"408894c5-c798-48ff-93ac-bc8ea114ee4a\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914909 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5q59\" (UniqueName: \"kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59\") pod \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\" (UID: \"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.914997 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content\") pod \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915047 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2lsl\" (UniqueName: \"kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl\") pod \"a833f00b-0fcb-416e-be3e-c4344adeef8d\" (UID: \"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915091 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities\") pod \"fecbc375-b333-42c1-bff9-3d17abec2eb2\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915145 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities\") pod \"a833f00b-0fcb-416e-be3e-c4344adeef8d\" (UID: 
\"a833f00b-0fcb-416e-be3e-c4344adeef8d\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915179 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8q2r\" (UniqueName: \"kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r\") pod \"fecbc375-b333-42c1-bff9-3d17abec2eb2\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915216 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57kvs\" (UniqueName: \"kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs\") pod \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\" (UID: \"ee9cd13e-3dd5-4fbf-8989-965350dff2e8\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915258 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca\") pod \"408894c5-c798-48ff-93ac-bc8ea114ee4a\" (UID: \"408894c5-c798-48ff-93ac-bc8ea114ee4a\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.915291 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content\") pod \"fecbc375-b333-42c1-bff9-3d17abec2eb2\" (UID: \"fecbc375-b333-42c1-bff9-3d17abec2eb2\") " Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.919802 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities" (OuterVolumeSpecName: "utilities") pod "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" (UID: "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.920769 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities" (OuterVolumeSpecName: "utilities") pod "ee9cd13e-3dd5-4fbf-8989-965350dff2e8" (UID: "ee9cd13e-3dd5-4fbf-8989-965350dff2e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.921490 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities" (OuterVolumeSpecName: "utilities") pod "a833f00b-0fcb-416e-be3e-c4344adeef8d" (UID: "a833f00b-0fcb-416e-be3e-c4344adeef8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.922855 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities" (OuterVolumeSpecName: "utilities") pod "fecbc375-b333-42c1-bff9-3d17abec2eb2" (UID: "fecbc375-b333-42c1-bff9-3d17abec2eb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.923863 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "408894c5-c798-48ff-93ac-bc8ea114ee4a" (UID: "408894c5-c798-48ff-93ac-bc8ea114ee4a"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.941498 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59" (OuterVolumeSpecName: "kube-api-access-x5q59") pod "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" (UID: "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33"). InnerVolumeSpecName "kube-api-access-x5q59". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.942044 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz" (OuterVolumeSpecName: "kube-api-access-4xtpz") pod "408894c5-c798-48ff-93ac-bc8ea114ee4a" (UID: "408894c5-c798-48ff-93ac-bc8ea114ee4a"). InnerVolumeSpecName "kube-api-access-4xtpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.942224 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs" (OuterVolumeSpecName: "kube-api-access-57kvs") pod "ee9cd13e-3dd5-4fbf-8989-965350dff2e8" (UID: "ee9cd13e-3dd5-4fbf-8989-965350dff2e8"). InnerVolumeSpecName "kube-api-access-57kvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.942262 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r" (OuterVolumeSpecName: "kube-api-access-j8q2r") pod "fecbc375-b333-42c1-bff9-3d17abec2eb2" (UID: "fecbc375-b333-42c1-bff9-3d17abec2eb2"). InnerVolumeSpecName "kube-api-access-j8q2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.942480 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "408894c5-c798-48ff-93ac-bc8ea114ee4a" (UID: "408894c5-c798-48ff-93ac-bc8ea114ee4a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.945900 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl" (OuterVolumeSpecName: "kube-api-access-x2lsl") pod "a833f00b-0fcb-416e-be3e-c4344adeef8d" (UID: "a833f00b-0fcb-416e-be3e-c4344adeef8d"). InnerVolumeSpecName "kube-api-access-x2lsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:46:42 crc kubenswrapper[4675]: I0216 19:46:42.988103 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fecbc375-b333-42c1-bff9-3d17abec2eb2" (UID: "fecbc375-b333-42c1-bff9-3d17abec2eb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.017736 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018163 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018183 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018194 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xtpz\" (UniqueName: \"kubernetes.io/projected/408894c5-c798-48ff-93ac-bc8ea114ee4a-kube-api-access-4xtpz\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018206 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5q59\" (UniqueName: \"kubernetes.io/projected/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-kube-api-access-x5q59\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018219 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2lsl\" (UniqueName: \"kubernetes.io/projected/a833f00b-0fcb-416e-be3e-c4344adeef8d-kube-api-access-x2lsl\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018231 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc 
kubenswrapper[4675]: I0216 19:46:43.018243 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018256 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8q2r\" (UniqueName: \"kubernetes.io/projected/fecbc375-b333-42c1-bff9-3d17abec2eb2-kube-api-access-j8q2r\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018265 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57kvs\" (UniqueName: \"kubernetes.io/projected/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-kube-api-access-57kvs\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018274 4675 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408894c5-c798-48ff-93ac-bc8ea114ee4a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.018282 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecbc375-b333-42c1-bff9-3d17abec2eb2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.020542 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" (UID: "eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.026507 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pww7"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.027228 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee9cd13e-3dd5-4fbf-8989-965350dff2e8" (UID: "ee9cd13e-3dd5-4fbf-8989-965350dff2e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.069482 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a833f00b-0fcb-416e-be3e-c4344adeef8d" (UID: "a833f00b-0fcb-416e-be3e-c4344adeef8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.119795 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.119845 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a833f00b-0fcb-416e-be3e-c4344adeef8d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.119858 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9cd13e-3dd5-4fbf-8989-965350dff2e8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.532039 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndhwt" event={"ID":"fecbc375-b333-42c1-bff9-3d17abec2eb2","Type":"ContainerDied","Data":"20eb1ad8332fe7b0c9da19df0b9721d3d48649ebcded7ef29598d52f33b3f041"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.532096 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndhwt" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.532136 4675 scope.go:117] "RemoveContainer" containerID="3d127274c6289a876eebbf2766615789d6b8503a82480dc02fc5374f7198afa1" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.534529 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" event={"ID":"408894c5-c798-48ff-93ac-bc8ea114ee4a","Type":"ContainerDied","Data":"87f352d61dab8b390f045f5835b03edb64f2e820328abd145ad4f940204f38a5"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.534603 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6kj4n" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.537366 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6msp" event={"ID":"eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33","Type":"ContainerDied","Data":"e85cc4ea14490b679737814c0df7875e00d5c61d44988dec7dceec4ae91ba873"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.537414 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6msp" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.540595 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grmd4" event={"ID":"ee9cd13e-3dd5-4fbf-8989-965350dff2e8","Type":"ContainerDied","Data":"04d0e7b020fbbcb0085bf9028d67c2ff0e702b553087ffd5c58044ee91e18b30"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.540648 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-grmd4" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.558094 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" event={"ID":"e7e5daa8-b9d8-4f3f-902e-36273ad65acb","Type":"ContainerStarted","Data":"db190312e313dd7f4a888d36e091f89ba56274e9fcc8443ee3dae1fd7caa5180"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.558151 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" event={"ID":"e7e5daa8-b9d8-4f3f-902e-36273ad65acb","Type":"ContainerStarted","Data":"ec5099b8e61d15246dc338a821af2c3313cd2c6f8f53911e45d79cf4a10cd80f"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.558628 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.565716 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfrnk" event={"ID":"a833f00b-0fcb-416e-be3e-c4344adeef8d","Type":"ContainerDied","Data":"5159449c1af4355c29d8f6617020fb2ed75eb703b79b15b82648edfe721eb4a6"} Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.565876 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kfrnk" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.565565 4675 scope.go:117] "RemoveContainer" containerID="8ceb1340194ea6c535d30d3a8e72737b9761d312c8e99f72a3c8e8e9edf5e015" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.582092 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.582779 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2pww7" podStartSLOduration=1.582755044 podStartE2EDuration="1.582755044s" podCreationTimestamp="2026-02-16 19:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:46:43.582444035 +0000 UTC m=+286.707733591" watchObservedRunningTime="2026-02-16 19:46:43.582755044 +0000 UTC m=+286.708044590" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.598151 4675 scope.go:117] "RemoveContainer" containerID="558bf60a74aeb3696d678168de44dd06804866c4fabf9f88a8196ddcbba2f3ba" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.624197 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.633602 4675 scope.go:117] "RemoveContainer" containerID="fb1bd0c01fe0d208c5c17d908853e15d6ee199b812c327ac4ce94792b244add8" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.647444 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6kj4n"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.657130 4675 scope.go:117] "RemoveContainer" containerID="144ee6fd7a97b4a71b0833920c5a41a5c5d867dcfb40cb217f8fb029acc45cab" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 
19:46:43.662332 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.667777 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-grmd4"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.670841 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.675062 4675 scope.go:117] "RemoveContainer" containerID="71838788e4e8f4cd2516ebd1f6177d4353440a42c2162854b2540ebf51eeb1b2" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.675231 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndhwt"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.678776 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.681184 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kfrnk"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.690729 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.694343 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6msp"] Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.716043 4675 scope.go:117] "RemoveContainer" containerID="a51b44a8dd25c0d3cc67de94ee4f65f34d9c3b7027804dcdd04a0b205a346726" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.736496 4675 scope.go:117] "RemoveContainer" containerID="877a79e2937424f39a4867f6f4a274bd649ac2396a6e38ca7c52594e7de3f2c5" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.753608 4675 scope.go:117] "RemoveContainer" 
containerID="2107c3931e2a211169fc7be814dd3700e16689ca03a14512561e275df538ea93" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.775872 4675 scope.go:117] "RemoveContainer" containerID="142797d8f95b84afe1930cda9106e6e018bb86ca29b8f8d1e365525a90c15786" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.791347 4675 scope.go:117] "RemoveContainer" containerID="d49e4d698d7921dd47b5de893a643c1130a48eb65a1d3c363b895c6528033815" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.805065 4675 scope.go:117] "RemoveContainer" containerID="555ca99dd51df738bc403af046e1a4cfeabd5180e6ffa43b503b78207bba9337" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.822090 4675 scope.go:117] "RemoveContainer" containerID="2fdce9395e32381374bb162dc3e507917d276a2bd8696cfea1835a27ace6f7ea" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.896258 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" path="/var/lib/kubelet/pods/408894c5-c798-48ff-93ac-bc8ea114ee4a/volumes" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.897473 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" path="/var/lib/kubelet/pods/a833f00b-0fcb-416e-be3e-c4344adeef8d/volumes" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.898371 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" path="/var/lib/kubelet/pods/eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33/volumes" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.899924 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" path="/var/lib/kubelet/pods/ee9cd13e-3dd5-4fbf-8989-965350dff2e8/volumes" Feb 16 19:46:43 crc kubenswrapper[4675]: I0216 19:46:43.900856 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" 
path="/var/lib/kubelet/pods/fecbc375-b333-42c1-bff9-3d17abec2eb2/volumes" Feb 16 19:46:57 crc kubenswrapper[4675]: I0216 19:46:57.592115 4675 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 16 19:46:59 crc kubenswrapper[4675]: I0216 19:46:59.787213 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 16 19:47:10 crc kubenswrapper[4675]: I0216 19:47:10.701852 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"] Feb 16 19:47:10 crc kubenswrapper[4675]: I0216 19:47:10.702806 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" podUID="d9de00f8-6995-42d6-ad28-1961096e55c0" containerName="controller-manager" containerID="cri-o://fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9" gracePeriod=30 Feb 16 19:47:10 crc kubenswrapper[4675]: I0216 19:47:10.814902 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"] Feb 16 19:47:10 crc kubenswrapper[4675]: I0216 19:47:10.815635 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerName="route-controller-manager" containerID="cri-o://023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c" gracePeriod=30 Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.135608 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.160854 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6rg4\" (UniqueName: \"kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4\") pod \"d9de00f8-6995-42d6-ad28-1961096e55c0\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.160994 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles\") pod \"d9de00f8-6995-42d6-ad28-1961096e55c0\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.161056 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert\") pod \"d9de00f8-6995-42d6-ad28-1961096e55c0\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.161134 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca\") pod \"d9de00f8-6995-42d6-ad28-1961096e55c0\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.161167 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config\") pod \"d9de00f8-6995-42d6-ad28-1961096e55c0\" (UID: \"d9de00f8-6995-42d6-ad28-1961096e55c0\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.162739 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config" (OuterVolumeSpecName: "config") pod "d9de00f8-6995-42d6-ad28-1961096e55c0" (UID: "d9de00f8-6995-42d6-ad28-1961096e55c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.162715 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d9de00f8-6995-42d6-ad28-1961096e55c0" (UID: "d9de00f8-6995-42d6-ad28-1961096e55c0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.163028 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9de00f8-6995-42d6-ad28-1961096e55c0" (UID: "d9de00f8-6995-42d6-ad28-1961096e55c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.169823 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9de00f8-6995-42d6-ad28-1961096e55c0" (UID: "d9de00f8-6995-42d6-ad28-1961096e55c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.174457 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4" (OuterVolumeSpecName: "kube-api-access-n6rg4") pod "d9de00f8-6995-42d6-ad28-1961096e55c0" (UID: "d9de00f8-6995-42d6-ad28-1961096e55c0"). InnerVolumeSpecName "kube-api-access-n6rg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.204315 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.262205 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca\") pod \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.262382 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnnlq\" (UniqueName: \"kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq\") pod \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.262482 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert\") pod \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.262568 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config\") pod \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\" (UID: \"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a\") " Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.262996 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 
19:47:11.263035 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9de00f8-6995-42d6-ad28-1961096e55c0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.263057 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.263078 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9de00f8-6995-42d6-ad28-1961096e55c0-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.263097 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6rg4\" (UniqueName: \"kubernetes.io/projected/d9de00f8-6995-42d6-ad28-1961096e55c0-kube-api-access-n6rg4\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.263812 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config" (OuterVolumeSpecName: "config") pod "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" (UID: "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.263887 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca" (OuterVolumeSpecName: "client-ca") pod "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" (UID: "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.266133 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq" (OuterVolumeSpecName: "kube-api-access-vnnlq") pod "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" (UID: "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a"). InnerVolumeSpecName "kube-api-access-vnnlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.266277 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" (UID: "f80b9c97-1cb5-427f-9b9f-4b62316d9e7a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.363864 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.363896 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.363908 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnnlq\" (UniqueName: \"kubernetes.io/projected/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-kube-api-access-vnnlq\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.363922 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a-serving-cert\") on node \"crc\" DevicePath 
\"\"" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.797543 4675 generic.go:334] "Generic (PLEG): container finished" podID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerID="023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c" exitCode=0 Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.797620 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.797649 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" event={"ID":"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a","Type":"ContainerDied","Data":"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c"} Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.797717 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z" event={"ID":"f80b9c97-1cb5-427f-9b9f-4b62316d9e7a","Type":"ContainerDied","Data":"eee48e579855f513be972a8235666d134e9b7a1feca7173d5fa3b7b923d0f79e"} Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.797738 4675 scope.go:117] "RemoveContainer" containerID="023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.800198 4675 generic.go:334] "Generic (PLEG): container finished" podID="d9de00f8-6995-42d6-ad28-1961096e55c0" containerID="fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9" exitCode=0 Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.800299 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.800291 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" event={"ID":"d9de00f8-6995-42d6-ad28-1961096e55c0","Type":"ContainerDied","Data":"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9"} Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.800391 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qzcjb" event={"ID":"d9de00f8-6995-42d6-ad28-1961096e55c0","Type":"ContainerDied","Data":"42be020f7cf6b1831d63f4be1772397f61d480b88c20294d7a73905ea93523e8"} Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.824047 4675 scope.go:117] "RemoveContainer" containerID="023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c" Feb 16 19:47:11 crc kubenswrapper[4675]: E0216 19:47:11.826980 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c\": container with ID starting with 023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c not found: ID does not exist" containerID="023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.827100 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c"} err="failed to get container status \"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c\": rpc error: code = NotFound desc = could not find container \"023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c\": container with ID starting with 023a978df3fd78c68a34a1312e8a2956396c8f897d22084c0ed520f1cc7a8b4c not found: ID does not 
exist" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.827152 4675 scope.go:117] "RemoveContainer" containerID="fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.844145 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"] Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.857670 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q894z"] Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.866448 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"] Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.870487 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qzcjb"] Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.876490 4675 scope.go:117] "RemoveContainer" containerID="fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9" Feb 16 19:47:11 crc kubenswrapper[4675]: E0216 19:47:11.877159 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9\": container with ID starting with fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9 not found: ID does not exist" containerID="fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.877366 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9"} err="failed to get container status \"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9\": rpc error: code = NotFound desc = could not find 
container \"fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9\": container with ID starting with fc51b6e99af5497842f7ecf7064e700c700e56f017af21b26577194c6965e3d9 not found: ID does not exist" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.891057 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9de00f8-6995-42d6-ad28-1961096e55c0" path="/var/lib/kubelet/pods/d9de00f8-6995-42d6-ad28-1961096e55c0/volumes" Feb 16 19:47:11 crc kubenswrapper[4675]: I0216 19:47:11.891797 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" path="/var/lib/kubelet/pods/f80b9c97-1cb5-427f-9b9f-4b62316d9e7a/volumes" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.702353 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"] Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.702961 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9de00f8-6995-42d6-ad28-1961096e55c0" containerName="controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.702993 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9de00f8-6995-42d6-ad28-1961096e55c0" containerName="controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703013 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703048 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703061 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703072 4675 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703089 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703101 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703140 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703150 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703167 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703176 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703214 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703226 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703244 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703258 4675 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703319 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerName="route-controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703335 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerName="route-controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703349 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703394 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703413 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703425 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703438 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703481 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703499 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703514 4675 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703530 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703573 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="extract-utilities" Feb 16 19:47:12 crc kubenswrapper[4675]: E0216 19:47:12.703590 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703602 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="extract-content" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703907 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9de00f8-6995-42d6-ad28-1961096e55c0" containerName="controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703925 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecbc375-b333-42c1-bff9-3d17abec2eb2" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703940 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2e7e26-8ed8-4cec-8f78-7d02cce6bb33" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703982 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80b9c97-1cb5-427f-9b9f-4b62316d9e7a" containerName="route-controller-manager" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.703997 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="a833f00b-0fcb-416e-be3e-c4344adeef8d" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 
19:47:12.704008 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="408894c5-c798-48ff-93ac-bc8ea114ee4a" containerName="marketplace-operator" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.704019 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9cd13e-3dd5-4fbf-8989-965350dff2e8" containerName="registry-server" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.705091 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.708098 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.709154 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.711368 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.712453 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.713271 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.713842 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.713927 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.714204 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.714829 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.715048 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.715267 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.718361 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.718764 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 19:47:12 crc kubenswrapper[4675]: 
I0216 19:47:12.719940 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.726957 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.732015 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"] Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.734561 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.784849 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fs9\" (UniqueName: \"kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.785410 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.785572 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " 
pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.785725 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.785880 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.786004 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.786149 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.786283 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.786406 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk479\" (UniqueName: \"kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887236 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887292 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887339 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " 
pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887367 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887395 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk479\" (UniqueName: \"kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887424 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fs9\" (UniqueName: \"kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887469 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887500 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.887529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.889391 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.889879 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.890815 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.893020 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.895296 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.897074 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.901112 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:12 crc kubenswrapper[4675]: I0216 19:47:12.921278 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fs9\" (UniqueName: \"kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9\") pod \"controller-manager-d944948c5-jb8c8\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:12 
crc kubenswrapper[4675]: I0216 19:47:12.924029 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk479\" (UniqueName: \"kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479\") pod \"route-controller-manager-f5c4ff7d7-bxzn7\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") " pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.047251 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.063245 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.260786 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"] Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.313363 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:13 crc kubenswrapper[4675]: W0216 19:47:13.326093 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf76ff91_7c18_4616_ba44_34c4efd8a97e.slice/crio-5bf7e415479c8c3e658b00db0c4b85280ed84e4240d14116c3cc89fef716151c WatchSource:0}: Error finding container 5bf7e415479c8c3e658b00db0c4b85280ed84e4240d14116c3cc89fef716151c: Status 404 returned error can't find the container with id 5bf7e415479c8c3e658b00db0c4b85280ed84e4240d14116c3cc89fef716151c Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.816142 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" 
event={"ID":"87afba8c-5880-44e5-9c75-c628a1b82d5d","Type":"ContainerStarted","Data":"cd56e944e240207fdce8f2bff59ae1329793a921c2ef9a96d460c7a39eaff84b"} Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.816236 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" event={"ID":"87afba8c-5880-44e5-9c75-c628a1b82d5d","Type":"ContainerStarted","Data":"c1580b693833d806098ad3cd09a7aadd6193347263df99a0990193b3699af86a"} Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.816465 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.818722 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" event={"ID":"cf76ff91-7c18-4616-ba44-34c4efd8a97e","Type":"ContainerStarted","Data":"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573"} Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.818755 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" event={"ID":"cf76ff91-7c18-4616-ba44-34c4efd8a97e","Type":"ContainerStarted","Data":"5bf7e415479c8c3e658b00db0c4b85280ed84e4240d14116c3cc89fef716151c"} Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.818936 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.826202 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.838301 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" podStartSLOduration=2.838278653 podStartE2EDuration="2.838278653s" podCreationTimestamp="2026-02-16 19:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:47:13.837554664 +0000 UTC m=+316.962844230" watchObservedRunningTime="2026-02-16 19:47:13.838278653 +0000 UTC m=+316.963568239" Feb 16 19:47:13 crc kubenswrapper[4675]: I0216 19:47:13.853025 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" podStartSLOduration=2.85299989 podStartE2EDuration="2.85299989s" podCreationTimestamp="2026-02-16 19:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:47:13.851733427 +0000 UTC m=+316.977022993" watchObservedRunningTime="2026-02-16 19:47:13.85299989 +0000 UTC m=+316.978289446" Feb 16 19:47:14 crc kubenswrapper[4675]: I0216 19:47:14.205801 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" Feb 16 19:47:16 crc kubenswrapper[4675]: I0216 19:47:16.629572 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:16 crc kubenswrapper[4675]: I0216 19:47:16.846162 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" podUID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" containerName="controller-manager" containerID="cri-o://116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573" gracePeriod=30 Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.308901 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.357540 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert\") pod \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.357729 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles\") pod \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.357767 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca\") pod \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.357814 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config\") pod \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.357842 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fs9\" (UniqueName: \"kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9\") pod \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\" (UID: \"cf76ff91-7c18-4616-ba44-34c4efd8a97e\") " Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.358929 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf76ff91-7c18-4616-ba44-34c4efd8a97e" (UID: "cf76ff91-7c18-4616-ba44-34c4efd8a97e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.359034 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cf76ff91-7c18-4616-ba44-34c4efd8a97e" (UID: "cf76ff91-7c18-4616-ba44-34c4efd8a97e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.359086 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config" (OuterVolumeSpecName: "config") pod "cf76ff91-7c18-4616-ba44-34c4efd8a97e" (UID: "cf76ff91-7c18-4616-ba44-34c4efd8a97e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.366041 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9" (OuterVolumeSpecName: "kube-api-access-45fs9") pod "cf76ff91-7c18-4616-ba44-34c4efd8a97e" (UID: "cf76ff91-7c18-4616-ba44-34c4efd8a97e"). InnerVolumeSpecName "kube-api-access-45fs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.366935 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf76ff91-7c18-4616-ba44-34c4efd8a97e" (UID: "cf76ff91-7c18-4616-ba44-34c4efd8a97e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.458728 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf76ff91-7c18-4616-ba44-34c4efd8a97e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.458774 4675 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.458791 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.458804 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf76ff91-7c18-4616-ba44-34c4efd8a97e-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.458815 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fs9\" (UniqueName: \"kubernetes.io/projected/cf76ff91-7c18-4616-ba44-34c4efd8a97e-kube-api-access-45fs9\") on node \"crc\" DevicePath \"\"" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.705059 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f8485cb6-dbkm2"] Feb 16 19:47:17 crc kubenswrapper[4675]: E0216 19:47:17.705331 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" containerName="controller-manager" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.705349 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" containerName="controller-manager" Feb 16 19:47:17 crc 
kubenswrapper[4675]: I0216 19:47:17.705498 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" containerName="controller-manager" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.706082 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.721738 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f8485cb6-dbkm2"] Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.856465 4675 generic.go:334] "Generic (PLEG): container finished" podID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" containerID="116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573" exitCode=0 Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.856531 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" event={"ID":"cf76ff91-7c18-4616-ba44-34c4efd8a97e","Type":"ContainerDied","Data":"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573"} Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.856608 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" event={"ID":"cf76ff91-7c18-4616-ba44-34c4efd8a97e","Type":"ContainerDied","Data":"5bf7e415479c8c3e658b00db0c4b85280ed84e4240d14116c3cc89fef716151c"} Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.856665 4675 scope.go:117] "RemoveContainer" containerID="116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.857041 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d944948c5-jb8c8" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.863878 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-client-ca\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.863993 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-proxy-ca-bundles\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.864054 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpcf\" (UniqueName: \"kubernetes.io/projected/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-kube-api-access-zxpcf\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.864298 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-config\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.864376 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-serving-cert\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.884828 4675 scope.go:117] "RemoveContainer" containerID="116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573" Feb 16 19:47:17 crc kubenswrapper[4675]: E0216 19:47:17.885608 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573\": container with ID starting with 116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573 not found: ID does not exist" containerID="116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.885872 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573"} err="failed to get container status \"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573\": rpc error: code = NotFound desc = could not find container \"116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573\": container with ID starting with 116afe54c0e6302c95f6d8da999959157fac64c82d6584aeaebae4d29e32f573 not found: ID does not exist" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.913383 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.917953 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d944948c5-jb8c8"] Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.965317 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-client-ca\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.965402 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-proxy-ca-bundles\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.965431 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpcf\" (UniqueName: \"kubernetes.io/projected/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-kube-api-access-zxpcf\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.965494 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-config\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.965529 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-serving-cert\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 
19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.966492 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-client-ca\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.966824 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-proxy-ca-bundles\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.967534 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-config\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.980345 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-serving-cert\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:17 crc kubenswrapper[4675]: I0216 19:47:17.986748 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpcf\" (UniqueName: \"kubernetes.io/projected/bf5ebe2b-bd1b-4f2b-ba68-ed405a611073-kube-api-access-zxpcf\") pod \"controller-manager-6f8485cb6-dbkm2\" (UID: \"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073\") " 
pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.041797 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.308460 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f8485cb6-dbkm2"] Feb 16 19:47:18 crc kubenswrapper[4675]: W0216 19:47:18.314384 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5ebe2b_bd1b_4f2b_ba68_ed405a611073.slice/crio-cd72e21f12c13267f65fd7f3be674891fa43cdfe506732a6b7dff71ba9f03ee4 WatchSource:0}: Error finding container cd72e21f12c13267f65fd7f3be674891fa43cdfe506732a6b7dff71ba9f03ee4: Status 404 returned error can't find the container with id cd72e21f12c13267f65fd7f3be674891fa43cdfe506732a6b7dff71ba9f03ee4 Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.865326 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" event={"ID":"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073","Type":"ContainerStarted","Data":"310481762cb4fb6f9968c5f0f7178ac97fcf77348f28a95879e3843f5d5c24ce"} Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.865398 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" event={"ID":"bf5ebe2b-bd1b-4f2b-ba68-ed405a611073","Type":"ContainerStarted","Data":"cd72e21f12c13267f65fd7f3be674891fa43cdfe506732a6b7dff71ba9f03ee4"} Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.865758 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.871247 4675 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" Feb 16 19:47:18 crc kubenswrapper[4675]: I0216 19:47:18.886590 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f8485cb6-dbkm2" podStartSLOduration=2.886555064 podStartE2EDuration="2.886555064s" podCreationTimestamp="2026-02-16 19:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:47:18.882548039 +0000 UTC m=+322.007837605" watchObservedRunningTime="2026-02-16 19:47:18.886555064 +0000 UTC m=+322.011844620" Feb 16 19:47:19 crc kubenswrapper[4675]: I0216 19:47:19.895972 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf76ff91-7c18-4616-ba44-34c4efd8a97e" path="/var/lib/kubelet/pods/cf76ff91-7c18-4616-ba44-34c4efd8a97e/volumes" Feb 16 19:47:47 crc kubenswrapper[4675]: I0216 19:47:47.553991 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:47:47 crc kubenswrapper[4675]: I0216 19:47:47.554795 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:47:50 crc kubenswrapper[4675]: I0216 19:47:50.706698 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"] Feb 16 19:47:50 crc kubenswrapper[4675]: I0216 19:47:50.706989 4675 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" podUID="87afba8c-5880-44e5-9c75-c628a1b82d5d" containerName="route-controller-manager" containerID="cri-o://cd56e944e240207fdce8f2bff59ae1329793a921c2ef9a96d460c7a39eaff84b" gracePeriod=30
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.069341 4675 generic.go:334] "Generic (PLEG): container finished" podID="87afba8c-5880-44e5-9c75-c628a1b82d5d" containerID="cd56e944e240207fdce8f2bff59ae1329793a921c2ef9a96d460c7a39eaff84b" exitCode=0
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.069793 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" event={"ID":"87afba8c-5880-44e5-9c75-c628a1b82d5d","Type":"ContainerDied","Data":"cd56e944e240207fdce8f2bff59ae1329793a921c2ef9a96d460c7a39eaff84b"}
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.150093 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.265576 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca\") pod \"87afba8c-5880-44e5-9c75-c628a1b82d5d\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") "
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.265651 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config\") pod \"87afba8c-5880-44e5-9c75-c628a1b82d5d\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") "
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.265676 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk479\" (UniqueName: \"kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479\") pod \"87afba8c-5880-44e5-9c75-c628a1b82d5d\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") "
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.265748 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert\") pod \"87afba8c-5880-44e5-9c75-c628a1b82d5d\" (UID: \"87afba8c-5880-44e5-9c75-c628a1b82d5d\") "
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.266702 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca" (OuterVolumeSpecName: "client-ca") pod "87afba8c-5880-44e5-9c75-c628a1b82d5d" (UID: "87afba8c-5880-44e5-9c75-c628a1b82d5d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.267427 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config" (OuterVolumeSpecName: "config") pod "87afba8c-5880-44e5-9c75-c628a1b82d5d" (UID: "87afba8c-5880-44e5-9c75-c628a1b82d5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.271936 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87afba8c-5880-44e5-9c75-c628a1b82d5d" (UID: "87afba8c-5880-44e5-9c75-c628a1b82d5d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.272037 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479" (OuterVolumeSpecName: "kube-api-access-xk479") pod "87afba8c-5880-44e5-9c75-c628a1b82d5d" (UID: "87afba8c-5880-44e5-9c75-c628a1b82d5d"). InnerVolumeSpecName "kube-api-access-xk479". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.367614 4675 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87afba8c-5880-44e5-9c75-c628a1b82d5d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.367660 4675 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-client-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.367671 4675 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87afba8c-5880-44e5-9c75-c628a1b82d5d-config\") on node \"crc\" DevicePath \"\""
Feb 16 19:47:51 crc kubenswrapper[4675]: I0216 19:47:51.367744 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk479\" (UniqueName: \"kubernetes.io/projected/87afba8c-5880-44e5-9c75-c628a1b82d5d-kube-api-access-xk479\") on node \"crc\" DevicePath \"\""
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.076974 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7" event={"ID":"87afba8c-5880-44e5-9c75-c628a1b82d5d","Type":"ContainerDied","Data":"c1580b693833d806098ad3cd09a7aadd6193347263df99a0990193b3699af86a"}
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.077029 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.077821 4675 scope.go:117] "RemoveContainer" containerID="cd56e944e240207fdce8f2bff59ae1329793a921c2ef9a96d460c7a39eaff84b"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.098534 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"]
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.101748 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5c4ff7d7-bxzn7"]
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.732083 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"]
Feb 16 19:47:52 crc kubenswrapper[4675]: E0216 19:47:52.732396 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87afba8c-5880-44e5-9c75-c628a1b82d5d" containerName="route-controller-manager"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.732414 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="87afba8c-5880-44e5-9c75-c628a1b82d5d" containerName="route-controller-manager"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.732546 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="87afba8c-5880-44e5-9c75-c628a1b82d5d" containerName="route-controller-manager"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.733124 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.743599 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.743736 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.744702 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.744762 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.744990 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.746737 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.750436 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"]
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.891169 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-serving-cert\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.891313 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-config\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.891352 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q68gg\" (UniqueName: \"kubernetes.io/projected/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-kube-api-access-q68gg\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.891387 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-client-ca\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.993091 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q68gg\" (UniqueName: \"kubernetes.io/projected/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-kube-api-access-q68gg\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.993167 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-client-ca\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.993291 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-serving-cert\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.993333 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-config\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.994779 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-client-ca\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.995335 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-config\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:52 crc kubenswrapper[4675]: I0216 19:47:52.999027 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-serving-cert\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:53 crc kubenswrapper[4675]: I0216 19:47:53.015594 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q68gg\" (UniqueName: \"kubernetes.io/projected/06e64af2-9bfe-49a4-ad52-e9cca2762a9f-kube-api-access-q68gg\") pod \"route-controller-manager-84b69847c4-9lldc\" (UID: \"06e64af2-9bfe-49a4-ad52-e9cca2762a9f\") " pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:53 crc kubenswrapper[4675]: I0216 19:47:53.060597 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:53 crc kubenswrapper[4675]: I0216 19:47:53.497969 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"]
Feb 16 19:47:53 crc kubenswrapper[4675]: I0216 19:47:53.891427 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87afba8c-5880-44e5-9c75-c628a1b82d5d" path="/var/lib/kubelet/pods/87afba8c-5880-44e5-9c75-c628a1b82d5d/volumes"
Feb 16 19:47:54 crc kubenswrapper[4675]: I0216 19:47:54.091283 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc" event={"ID":"06e64af2-9bfe-49a4-ad52-e9cca2762a9f","Type":"ContainerStarted","Data":"95b34d7a7b8cfafceedfe779a7cd7a643ee6cbb53315e95b242693c4c117b361"}
Feb 16 19:47:54 crc kubenswrapper[4675]: I0216 19:47:54.091815 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc" event={"ID":"06e64af2-9bfe-49a4-ad52-e9cca2762a9f","Type":"ContainerStarted","Data":"5a96ac57979dddeb9301599413c51ba201f8fb19d52276d8665cb175736c18a2"}
Feb 16 19:47:54 crc kubenswrapper[4675]: I0216 19:47:54.091838 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:54 crc kubenswrapper[4675]: I0216 19:47:54.097609 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc"
Feb 16 19:47:54 crc kubenswrapper[4675]: I0216 19:47:54.131548 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84b69847c4-9lldc" podStartSLOduration=4.131522373 podStartE2EDuration="4.131522373s" podCreationTimestamp="2026-02-16 19:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:47:54.114126458 +0000 UTC m=+357.239416024" watchObservedRunningTime="2026-02-16 19:47:54.131522373 +0000 UTC m=+357.256811929"
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.833481 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rnmhj"]
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.835496 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.837466 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.848780 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnmhj"]
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.982779 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvvs\" (UniqueName: \"kubernetes.io/projected/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-kube-api-access-vtvvs\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.982851 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-catalog-content\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:04 crc kubenswrapper[4675]: I0216 19:48:04.982966 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-utilities\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.084290 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvvs\" (UniqueName: \"kubernetes.io/projected/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-kube-api-access-vtvvs\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.084391 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-catalog-content\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.084468 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-utilities\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.085060 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-utilities\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.085234 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-catalog-content\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.107414 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvvs\" (UniqueName: \"kubernetes.io/projected/e3ad6b2a-aa4d-45e9-8931-019ca2d29e19-kube-api-access-vtvvs\") pod \"redhat-marketplace-rnmhj\" (UID: \"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19\") " pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.160978 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnmhj"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.434330 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xxn99"]
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.437259 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.442120 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.448502 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxn99"]
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.593768 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gclcr\" (UniqueName: \"kubernetes.io/projected/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-kube-api-access-gclcr\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.593886 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-catalog-content\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.594099 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-utilities\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.651986 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnmhj"]
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.695939 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-utilities\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.696212 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gclcr\" (UniqueName: \"kubernetes.io/projected/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-kube-api-access-gclcr\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.696358 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-catalog-content\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.697186 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-catalog-content\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.697542 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-utilities\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:05 crc kubenswrapper[4675]: I0216 19:48:05.731486 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gclcr\" (UniqueName: \"kubernetes.io/projected/fa11d522-5f3a-456b-b8f4-9e60fd4ad519-kube-api-access-gclcr\") pod \"community-operators-xxn99\" (UID: \"fa11d522-5f3a-456b-b8f4-9e60fd4ad519\") " pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:06 crc kubenswrapper[4675]: I0216 19:48:06.169580 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnmhj" event={"ID":"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19","Type":"ContainerStarted","Data":"75eaecec068d2d99c8e54cd5cd9fae1004523c4ddd2f92eda54ca22863aa24e5"}
Feb 16 19:48:06 crc kubenswrapper[4675]: I0216 19:48:06.814895 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxn99"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.176795 4675 generic.go:334] "Generic (PLEG): container finished" podID="e3ad6b2a-aa4d-45e9-8931-019ca2d29e19" containerID="2d46506f6e80c9899bd3fa7813ebb067121e5f2f79448e9494f58aa8487948b0" exitCode=0
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.176895 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnmhj" event={"ID":"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19","Type":"ContainerDied","Data":"2d46506f6e80c9899bd3fa7813ebb067121e5f2f79448e9494f58aa8487948b0"}
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.230155 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdjm9"]
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.231286 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.233875 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.255072 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdjm9"]
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.273964 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxn99"]
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.322297 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-catalog-content\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.322743 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgth2\" (UniqueName: \"kubernetes.io/projected/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-kube-api-access-mgth2\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.322781 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-utilities\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.423950 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-catalog-content\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.424048 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgth2\" (UniqueName: \"kubernetes.io/projected/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-kube-api-access-mgth2\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.424097 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-utilities\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.424593 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-utilities\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.424629 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-catalog-content\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.444433 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgth2\" (UniqueName: \"kubernetes.io/projected/fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b-kube-api-access-mgth2\") pod \"redhat-operators-wdjm9\" (UID: \"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b\") " pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.547760 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdjm9"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.830220 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"]
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.833883 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.840950 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.853934 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"]
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.875255 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdjm9"]
Feb 16 19:48:07 crc kubenswrapper[4675]: W0216 19:48:07.882796 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe82aa8d_5f71_47cf_9c1f_6b86a66fe07b.slice/crio-f67eab4caddaff391c55e7e75e53d104f6992ef9c6c6401f41ecb7ff44eaf822 WatchSource:0}: Error finding container f67eab4caddaff391c55e7e75e53d104f6992ef9c6c6401f41ecb7ff44eaf822: Status 404 returned error can't find the container with id f67eab4caddaff391c55e7e75e53d104f6992ef9c6c6401f41ecb7ff44eaf822
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.941051 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.941295 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:07 crc kubenswrapper[4675]: I0216 19:48:07.941479 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gml4p\" (UniqueName: \"kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.043179 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.043247 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.043400 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gml4p\" (UniqueName: \"kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.043973 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.044098 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.066444 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gml4p\" (UniqueName: \"kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p\") pod \"certified-operators-8hb2k\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.164199 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hb2k"
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.183035 4675 generic.go:334] "Generic (PLEG): container finished" podID="fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b" containerID="4c873ef7c31367c2c037375111f2a88b4f1c9ba7d7388b202a1518ade1a38563" exitCode=0
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.183120 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdjm9" event={"ID":"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b","Type":"ContainerDied","Data":"4c873ef7c31367c2c037375111f2a88b4f1c9ba7d7388b202a1518ade1a38563"}
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.183170 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdjm9" event={"ID":"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b","Type":"ContainerStarted","Data":"f67eab4caddaff391c55e7e75e53d104f6992ef9c6c6401f41ecb7ff44eaf822"}
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.208088 4675 generic.go:334] "Generic (PLEG): container finished" podID="fa11d522-5f3a-456b-b8f4-9e60fd4ad519" containerID="552caa8808bcf6671d9b3a20b77efbbb847229a98d395332b1f59c941d3f1454" exitCode=0
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.208202 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn99" event={"ID":"fa11d522-5f3a-456b-b8f4-9e60fd4ad519","Type":"ContainerDied","Data":"552caa8808bcf6671d9b3a20b77efbbb847229a98d395332b1f59c941d3f1454"}
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.208237 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn99" event={"ID":"fa11d522-5f3a-456b-b8f4-9e60fd4ad519","Type":"ContainerStarted","Data":"2655d01005f5393aa8fc8bb0139a7b94fff206fe902c5ef61c8f7a2e8ea56ef7"}
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.213213 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnmhj" event={"ID":"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19","Type":"ContainerStarted","Data":"25299af9b7d32fadd69706028c88f2854e1605d559efd2b6896a697a23c4214f"}
Feb 16 19:48:08 crc kubenswrapper[4675]: I0216 19:48:08.420778 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"]
Feb 16 19:48:08 crc kubenswrapper[4675]: W0216 19:48:08.434272 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1984d92c_2f8f_431e_9006_2a8e14bad660.slice/crio-24b927a4145a3fb832b93a4e9b62e5e47f7c9b646933dc2c8800a167b293beb2 WatchSource:0}: Error finding container 24b927a4145a3fb832b93a4e9b62e5e47f7c9b646933dc2c8800a167b293beb2: Status 404 returned error can't find the container with id 24b927a4145a3fb832b93a4e9b62e5e47f7c9b646933dc2c8800a167b293beb2
Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.225127 4675 generic.go:334] "Generic (PLEG): container finished" podID="e3ad6b2a-aa4d-45e9-8931-019ca2d29e19"
containerID="25299af9b7d32fadd69706028c88f2854e1605d559efd2b6896a697a23c4214f" exitCode=0 Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.225631 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnmhj" event={"ID":"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19","Type":"ContainerDied","Data":"25299af9b7d32fadd69706028c88f2854e1605d559efd2b6896a697a23c4214f"} Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.227289 4675 generic.go:334] "Generic (PLEG): container finished" podID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerID="8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36" exitCode=0 Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.227349 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerDied","Data":"8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36"} Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.227386 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerStarted","Data":"24b927a4145a3fb832b93a4e9b62e5e47f7c9b646933dc2c8800a167b293beb2"} Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.237795 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdjm9" event={"ID":"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b","Type":"ContainerStarted","Data":"8135cb7349e15385ba22216ff9ab8208657219cb89d7aca6597e542e866f6c1d"} Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.241075 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn99" event={"ID":"fa11d522-5f3a-456b-b8f4-9e60fd4ad519","Type":"ContainerStarted","Data":"2f1b132eb6a9182eac5ce8dcc20eefb996271f418d86d227608b905465fe7b12"} Feb 16 19:48:09 crc kubenswrapper[4675]: 
I0216 19:48:09.302816 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4hgn"] Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.304132 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.325660 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4hgn"] Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.474713 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-bound-sa-token\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.474837 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.474872 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-certificates\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.474898 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4xgrj\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-kube-api-access-4xgrj\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.474986 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.475020 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-tls\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.475059 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-trusted-ca\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.475102 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.516845 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577248 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-bound-sa-token\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577351 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577392 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-certificates\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577418 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgrj\" (UniqueName: 
\"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-kube-api-access-4xgrj\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577460 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577492 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-tls\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.577533 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-trusted-ca\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.578328 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.579171 4675 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-trusted-ca\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.579269 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-certificates\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.586839 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.588501 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-registry-tls\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.596826 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-bound-sa-token\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.600436 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgrj\" (UniqueName: \"kubernetes.io/projected/b31e6f23-b5b7-4d64-b256-e57b5f7c9d01-kube-api-access-4xgrj\") pod \"image-registry-66df7c8f76-k4hgn\" (UID: \"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.622886 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:09 crc kubenswrapper[4675]: I0216 19:48:09.866211 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4hgn"] Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.249220 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" event={"ID":"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01","Type":"ContainerStarted","Data":"dcd2fb0b03fbce78bf41f461a1b235cb7435d33488935a0235863e7f6c530410"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.249300 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" event={"ID":"b31e6f23-b5b7-4d64-b256-e57b5f7c9d01","Type":"ContainerStarted","Data":"9a75c160a97bed2de40a278fc84bd17e2163d9d60223cf507b29620c8cdc8189"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.249636 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.251556 4675 generic.go:334] "Generic (PLEG): container finished" podID="fa11d522-5f3a-456b-b8f4-9e60fd4ad519" containerID="2f1b132eb6a9182eac5ce8dcc20eefb996271f418d86d227608b905465fe7b12" exitCode=0 Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.251622 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-xxn99" event={"ID":"fa11d522-5f3a-456b-b8f4-9e60fd4ad519","Type":"ContainerDied","Data":"2f1b132eb6a9182eac5ce8dcc20eefb996271f418d86d227608b905465fe7b12"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.253711 4675 generic.go:334] "Generic (PLEG): container finished" podID="fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b" containerID="8135cb7349e15385ba22216ff9ab8208657219cb89d7aca6597e542e866f6c1d" exitCode=0 Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.253778 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdjm9" event={"ID":"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b","Type":"ContainerDied","Data":"8135cb7349e15385ba22216ff9ab8208657219cb89d7aca6597e542e866f6c1d"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.260892 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnmhj" event={"ID":"e3ad6b2a-aa4d-45e9-8931-019ca2d29e19","Type":"ContainerStarted","Data":"c945dbb7eba4ded6a2702b24582b832d2d32259008b22f74ea0f5ce90ec6a508"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.267324 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerStarted","Data":"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193"} Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.283786 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" podStartSLOduration=1.283757132 podStartE2EDuration="1.283757132s" podCreationTimestamp="2026-02-16 19:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:48:10.278190927 +0000 UTC m=+373.403480483" watchObservedRunningTime="2026-02-16 19:48:10.283757132 +0000 UTC 
m=+373.409046698" Feb 16 19:48:10 crc kubenswrapper[4675]: I0216 19:48:10.302401 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rnmhj" podStartSLOduration=3.821405601 podStartE2EDuration="6.302377012s" podCreationTimestamp="2026-02-16 19:48:04 +0000 UTC" firstStartedPulling="2026-02-16 19:48:07.178945837 +0000 UTC m=+370.304235393" lastFinishedPulling="2026-02-16 19:48:09.659917248 +0000 UTC m=+372.785206804" observedRunningTime="2026-02-16 19:48:10.298412641 +0000 UTC m=+373.423702227" watchObservedRunningTime="2026-02-16 19:48:10.302377012 +0000 UTC m=+373.427666558" Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.274286 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdjm9" event={"ID":"fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b","Type":"ContainerStarted","Data":"e356ee5d42c066c17277d870dd213e149e412700f229b336fe2ad4412dbbc501"} Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.277815 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxn99" event={"ID":"fa11d522-5f3a-456b-b8f4-9e60fd4ad519","Type":"ContainerStarted","Data":"c6ba62b1877bdb6468c95429a111eee877f0f8791d1d642cbeeba6aa45fd90e9"} Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.280709 4675 generic.go:334] "Generic (PLEG): container finished" podID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerID="4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193" exitCode=0 Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.280851 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerDied","Data":"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193"} Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.299039 4675 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-wdjm9" podStartSLOduration=1.808695698 podStartE2EDuration="4.299018451s" podCreationTimestamp="2026-02-16 19:48:07 +0000 UTC" firstStartedPulling="2026-02-16 19:48:08.1846692 +0000 UTC m=+371.309958756" lastFinishedPulling="2026-02-16 19:48:10.674991953 +0000 UTC m=+373.800281509" observedRunningTime="2026-02-16 19:48:11.29396451 +0000 UTC m=+374.419254066" watchObservedRunningTime="2026-02-16 19:48:11.299018451 +0000 UTC m=+374.424308007" Feb 16 19:48:11 crc kubenswrapper[4675]: I0216 19:48:11.366053 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xxn99" podStartSLOduration=3.931289961 podStartE2EDuration="6.366029032s" podCreationTimestamp="2026-02-16 19:48:05 +0000 UTC" firstStartedPulling="2026-02-16 19:48:08.21617818 +0000 UTC m=+371.341467736" lastFinishedPulling="2026-02-16 19:48:10.650917251 +0000 UTC m=+373.776206807" observedRunningTime="2026-02-16 19:48:11.361599088 +0000 UTC m=+374.486888634" watchObservedRunningTime="2026-02-16 19:48:11.366029032 +0000 UTC m=+374.491318588" Feb 16 19:48:12 crc kubenswrapper[4675]: I0216 19:48:12.290209 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerStarted","Data":"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13"} Feb 16 19:48:12 crc kubenswrapper[4675]: I0216 19:48:12.329499 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8hb2k" podStartSLOduration=2.842108065 podStartE2EDuration="5.329451704s" podCreationTimestamp="2026-02-16 19:48:07 +0000 UTC" firstStartedPulling="2026-02-16 19:48:09.230897244 +0000 UTC m=+372.356186800" lastFinishedPulling="2026-02-16 19:48:11.718240883 +0000 UTC m=+374.843530439" observedRunningTime="2026-02-16 19:48:12.320951627 +0000 UTC m=+375.446241203" 
watchObservedRunningTime="2026-02-16 19:48:12.329451704 +0000 UTC m=+375.454741260" Feb 16 19:48:15 crc kubenswrapper[4675]: I0216 19:48:15.161868 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rnmhj" Feb 16 19:48:15 crc kubenswrapper[4675]: I0216 19:48:15.162321 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rnmhj" Feb 16 19:48:15 crc kubenswrapper[4675]: I0216 19:48:15.219416 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rnmhj" Feb 16 19:48:15 crc kubenswrapper[4675]: I0216 19:48:15.349354 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rnmhj" Feb 16 19:48:16 crc kubenswrapper[4675]: I0216 19:48:16.816893 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xxn99" Feb 16 19:48:16 crc kubenswrapper[4675]: I0216 19:48:16.816997 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xxn99" Feb 16 19:48:16 crc kubenswrapper[4675]: I0216 19:48:16.896608 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xxn99" Feb 16 19:48:17 crc kubenswrapper[4675]: I0216 19:48:17.363873 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xxn99" Feb 16 19:48:17 crc kubenswrapper[4675]: I0216 19:48:17.548894 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdjm9" Feb 16 19:48:17 crc kubenswrapper[4675]: I0216 19:48:17.548942 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wdjm9" Feb 16 19:48:17 crc 
kubenswrapper[4675]: I0216 19:48:17.554373 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:48:17 crc kubenswrapper[4675]: I0216 19:48:17.554459 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:48:17 crc kubenswrapper[4675]: I0216 19:48:17.586742 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wdjm9" Feb 16 19:48:18 crc kubenswrapper[4675]: I0216 19:48:18.164373 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:48:18 crc kubenswrapper[4675]: I0216 19:48:18.165772 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:48:18 crc kubenswrapper[4675]: I0216 19:48:18.211642 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:48:18 crc kubenswrapper[4675]: I0216 19:48:18.382883 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdjm9" Feb 16 19:48:18 crc kubenswrapper[4675]: I0216 19:48:18.388577 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:48:29 crc kubenswrapper[4675]: I0216 19:48:29.630123 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-k4hgn" Feb 16 19:48:29 crc kubenswrapper[4675]: I0216 19:48:29.695832 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"] Feb 16 19:48:47 crc kubenswrapper[4675]: I0216 19:48:47.553814 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:48:47 crc kubenswrapper[4675]: I0216 19:48:47.554563 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:48:47 crc kubenswrapper[4675]: I0216 19:48:47.554627 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:48:47 crc kubenswrapper[4675]: I0216 19:48:47.555465 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 19:48:47 crc kubenswrapper[4675]: I0216 19:48:47.555547 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" 
containerID="cri-o://8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6" gracePeriod=600
Feb 16 19:48:48 crc kubenswrapper[4675]: I0216 19:48:48.524815 4675 generic.go:334] "Generic (PLEG): container finished" podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6" exitCode=0
Feb 16 19:48:48 crc kubenswrapper[4675]: I0216 19:48:48.525106 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6"}
Feb 16 19:48:48 crc kubenswrapper[4675]: I0216 19:48:48.525311 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c"}
Feb 16 19:48:48 crc kubenswrapper[4675]: I0216 19:48:48.525349 4675 scope.go:117] "RemoveContainer" containerID="92e33c01bc9214841f42d4ca6e58fb062a9e94fe39bdc5ad08ad2351cd270058"
Feb 16 19:48:54 crc kubenswrapper[4675]: I0216 19:48:54.770547 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" podUID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" containerName="registry" containerID="cri-o://9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2" gracePeriod=30
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.228878 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338257 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338599 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338634 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338714 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338741 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzzc5\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338761 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338829 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.338852 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca\") pod \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\" (UID: \"2a8b7c95-1c8f-4de9-907c-34b0b0848b13\") "
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.340127 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.341673 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.348268 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5" (OuterVolumeSpecName: "kube-api-access-rzzc5") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "kube-api-access-rzzc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.348493 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.349320 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.354899 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.358353 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.361329 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2a8b7c95-1c8f-4de9-907c-34b0b0848b13" (UID: "2a8b7c95-1c8f-4de9-907c-34b0b0848b13"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441212 4675 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441284 4675 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441299 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzzc5\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-kube-api-access-rzzc5\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441315 4675 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441329 4675 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441341 4675 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.441357 4675 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a8b7c95-1c8f-4de9-907c-34b0b0848b13-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.587457 4675 generic.go:334] "Generic (PLEG): container finished" podID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" containerID="9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2" exitCode=0
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.587517 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" event={"ID":"2a8b7c95-1c8f-4de9-907c-34b0b0848b13","Type":"ContainerDied","Data":"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"}
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.587548 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc"
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.587566 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ctjpc" event={"ID":"2a8b7c95-1c8f-4de9-907c-34b0b0848b13","Type":"ContainerDied","Data":"0c9d7007fe2bfbd301fb6f11d385445d37255643b4ab60860b542f783eb1f478"}
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.587591 4675 scope.go:117] "RemoveContainer" containerID="9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.617453 4675 scope.go:117] "RemoveContainer" containerID="9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"
Feb 16 19:48:55 crc kubenswrapper[4675]: E0216 19:48:55.618432 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2\": container with ID starting with 9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2 not found: ID does not exist" containerID="9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.618510 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2"} err="failed to get container status \"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2\": rpc error: code = NotFound desc = could not find container \"9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2\": container with ID starting with 9bf239a77dc428c119502333d5d52fda020025c93f856b9b004cc880a1b2ffc2 not found: ID does not exist"
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.638121 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"]
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.645288 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ctjpc"]
Feb 16 19:48:55 crc kubenswrapper[4675]: I0216 19:48:55.894237 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" path="/var/lib/kubelet/pods/2a8b7c95-1c8f-4de9-907c-34b0b0848b13/volumes"
Feb 16 19:50:47 crc kubenswrapper[4675]: I0216 19:50:47.554081 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:50:47 crc kubenswrapper[4675]: I0216 19:50:47.555311 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:51:17 crc kubenswrapper[4675]: I0216 19:51:17.553604 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:51:17 crc kubenswrapper[4675]: I0216 19:51:17.554591 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.520569 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"]
Feb 16 19:51:38 crc kubenswrapper[4675]: E0216 19:51:38.521475 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" containerName="registry"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.521494 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" containerName="registry"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.521623 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8b7c95-1c8f-4de9-907c-34b0b0848b13" containerName="registry"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.522123 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.530204 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.530706 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ptg2n"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.538465 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.542001 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9qjp\" (UniqueName: \"kubernetes.io/projected/8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea-kube-api-access-f9qjp\") pod \"cert-manager-cainjector-cf98fcc89-j4tq2\" (UID: \"8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.542430 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-l4grk"]
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.543182 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-l4grk"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.544733 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bgl76"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.561277 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-l4grk"]
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.567514 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hmxhx"]
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.573485 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.576609 4675 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-w8cbt"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.597411 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"]
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.631777 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hmxhx"]
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.681665 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cr8s\" (UniqueName: \"kubernetes.io/projected/226e5ab0-119e-40ef-86f9-e2243de79a09-kube-api-access-2cr8s\") pod \"cert-manager-858654f9db-l4grk\" (UID: \"226e5ab0-119e-40ef-86f9-e2243de79a09\") " pod="cert-manager/cert-manager-858654f9db-l4grk"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.681759 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9qjp\" (UniqueName: \"kubernetes.io/projected/8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea-kube-api-access-f9qjp\") pod \"cert-manager-cainjector-cf98fcc89-j4tq2\" (UID: \"8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.681836 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rhx\" (UniqueName: \"kubernetes.io/projected/b779aec3-fe64-4ef2-ba6b-8caf7b98a984-kube-api-access-p9rhx\") pod \"cert-manager-webhook-687f57d79b-hmxhx\" (UID: \"b779aec3-fe64-4ef2-ba6b-8caf7b98a984\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.737661 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9qjp\" (UniqueName: \"kubernetes.io/projected/8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea-kube-api-access-f9qjp\") pod \"cert-manager-cainjector-cf98fcc89-j4tq2\" (UID: \"8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.783475 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rhx\" (UniqueName: \"kubernetes.io/projected/b779aec3-fe64-4ef2-ba6b-8caf7b98a984-kube-api-access-p9rhx\") pod \"cert-manager-webhook-687f57d79b-hmxhx\" (UID: \"b779aec3-fe64-4ef2-ba6b-8caf7b98a984\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.783558 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cr8s\" (UniqueName: \"kubernetes.io/projected/226e5ab0-119e-40ef-86f9-e2243de79a09-kube-api-access-2cr8s\") pod \"cert-manager-858654f9db-l4grk\" (UID: \"226e5ab0-119e-40ef-86f9-e2243de79a09\") " pod="cert-manager/cert-manager-858654f9db-l4grk"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.799001 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rhx\" (UniqueName: \"kubernetes.io/projected/b779aec3-fe64-4ef2-ba6b-8caf7b98a984-kube-api-access-p9rhx\") pod \"cert-manager-webhook-687f57d79b-hmxhx\" (UID: \"b779aec3-fe64-4ef2-ba6b-8caf7b98a984\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.809484 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cr8s\" (UniqueName: \"kubernetes.io/projected/226e5ab0-119e-40ef-86f9-e2243de79a09-kube-api-access-2cr8s\") pod \"cert-manager-858654f9db-l4grk\" (UID: \"226e5ab0-119e-40ef-86f9-e2243de79a09\") " pod="cert-manager/cert-manager-858654f9db-l4grk"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.840988 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.856943 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-l4grk"
Feb 16 19:51:38 crc kubenswrapper[4675]: I0216 19:51:38.910186 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.084547 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-l4grk"]
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.094911 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.124594 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2"]
Feb 16 19:51:39 crc kubenswrapper[4675]: W0216 19:51:39.134912 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8acb6f4c_29a5_4b86_bba5_3754ae8ec0ea.slice/crio-df68c8eb18b68296c8caf36f55387f632e630e45aef8265fb367c14e62e20f94 WatchSource:0}: Error finding container df68c8eb18b68296c8caf36f55387f632e630e45aef8265fb367c14e62e20f94: Status 404 returned error can't find the container with id df68c8eb18b68296c8caf36f55387f632e630e45aef8265fb367c14e62e20f94
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.187386 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hmxhx"]
Feb 16 19:51:39 crc kubenswrapper[4675]: W0216 19:51:39.189757 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb779aec3_fe64_4ef2_ba6b_8caf7b98a984.slice/crio-04d33f7055cb6c6ae81385bae4e5c087e727af1dd7d6e2477c4dfb49368d31d3 WatchSource:0}: Error finding container 04d33f7055cb6c6ae81385bae4e5c087e727af1dd7d6e2477c4dfb49368d31d3: Status 404 returned error can't find the container with id 04d33f7055cb6c6ae81385bae4e5c087e727af1dd7d6e2477c4dfb49368d31d3
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.746166 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx" event={"ID":"b779aec3-fe64-4ef2-ba6b-8caf7b98a984","Type":"ContainerStarted","Data":"04d33f7055cb6c6ae81385bae4e5c087e727af1dd7d6e2477c4dfb49368d31d3"}
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.747294 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-l4grk" event={"ID":"226e5ab0-119e-40ef-86f9-e2243de79a09","Type":"ContainerStarted","Data":"fc483eea52823f90dc0330402e945ba1758e50f3bb7f36ba592b722fbace82b7"}
Feb 16 19:51:39 crc kubenswrapper[4675]: I0216 19:51:39.750087 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2" event={"ID":"8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea","Type":"ContainerStarted","Data":"df68c8eb18b68296c8caf36f55387f632e630e45aef8265fb367c14e62e20f94"}
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.788934 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx" event={"ID":"b779aec3-fe64-4ef2-ba6b-8caf7b98a984","Type":"ContainerStarted","Data":"4b27808872e1691259b4b11ed5fc0794be1bfc274a4841b9f6316dcd879e47c1"}
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.790131 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx"
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.793467 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-l4grk" event={"ID":"226e5ab0-119e-40ef-86f9-e2243de79a09","Type":"ContainerStarted","Data":"adb03b021090af17395518999fcf055184eea7903ed7e0d4b0a68149c22bfb41"}
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.794981 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2" event={"ID":"8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea","Type":"ContainerStarted","Data":"8ab3567cb93792713d912da4af7dc7c7afb686d19f1f54f9b8ee4a27d13d5fb8"}
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.810014 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx" podStartSLOduration=1.481981006 podStartE2EDuration="5.809990121s" podCreationTimestamp="2026-02-16 19:51:38 +0000 UTC" firstStartedPulling="2026-02-16 19:51:39.192323731 +0000 UTC m=+582.317613297" lastFinishedPulling="2026-02-16 19:51:43.520332816 +0000 UTC m=+586.645622412" observedRunningTime="2026-02-16 19:51:43.809679573 +0000 UTC m=+586.934969129" watchObservedRunningTime="2026-02-16 19:51:43.809990121 +0000 UTC m=+586.935279687"
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.835876 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-l4grk" podStartSLOduration=1.476185299 podStartE2EDuration="5.835844588s" podCreationTimestamp="2026-02-16 19:51:38 +0000 UTC" firstStartedPulling="2026-02-16 19:51:39.094508905 +0000 UTC m=+582.219798461" lastFinishedPulling="2026-02-16 19:51:43.454168184 +0000 UTC m=+586.579457750" observedRunningTime="2026-02-16 19:51:43.830388045 +0000 UTC m=+586.955677651" watchObservedRunningTime="2026-02-16 19:51:43.835844588 +0000 UTC m=+586.961134154"
Feb 16 19:51:43 crc kubenswrapper[4675]: I0216 19:51:43.852362 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j4tq2" podStartSLOduration=1.5556415270000001 podStartE2EDuration="5.85234091s" podCreationTimestamp="2026-02-16 19:51:38 +0000 UTC" firstStartedPulling="2026-02-16 19:51:39.137834337 +0000 UTC m=+582.263123903" lastFinishedPulling="2026-02-16 19:51:43.43453372 +0000 UTC m=+586.559823286" observedRunningTime="2026-02-16 19:51:43.848434957 +0000 UTC m=+586.973724513" watchObservedRunningTime="2026-02-16 19:51:43.85234091 +0000 UTC m=+586.977630456"
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.554595 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.555939 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.556040 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb"
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.557161 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.557279 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" containerID="cri-o://069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c" gracePeriod=600
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.842584 4675 generic.go:334] "Generic (PLEG): container finished" podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c" exitCode=0
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.842671 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c"}
Feb 16 19:51:47 crc kubenswrapper[4675]: I0216 19:51:47.843202 4675 scope.go:117] "RemoveContainer" containerID="8f5e8d3ef71727a87301b2848355647df663baf2a84aa1256862d3fc0de85ce6"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.431240 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpc5z"]
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432384 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-controller" containerID="cri-o://ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432757 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-node" containerID="cri-o://958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432792 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="sbdb" containerID="cri-o://d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432843 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432809 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="northd" containerID="cri-o://1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.432915 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-acl-logging" containerID="cri-o://9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.433007 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="nbdb" containerID="cri-o://13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.469051 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" containerID="cri-o://73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" gracePeriod=30
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.761839 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/3.log"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.766047 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovn-acl-logging/0.log"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.766567 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovn-controller/0.log"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.767181 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.841798 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mgsc6"]
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842072 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842088 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842099 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="sbdb"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842108 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="sbdb"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842120 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-ovn-metrics"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842127 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-ovn-metrics"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842137 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="northd"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842143 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="northd"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842152 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kubecfg-setup"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842158 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kubecfg-setup"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842168 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-node"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842174 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-node"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842184 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-controller"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842190 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-controller"
Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842198 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-acl-logging"
Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842205
4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-acl-logging" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842214 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842220 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842227 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="nbdb" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842232 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="nbdb" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842241 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842246 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.842252 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842258 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842374 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="northd" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842384 4675 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="sbdb" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842391 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842400 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842415 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-node" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842424 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="nbdb" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842431 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovn-acl-logging" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842437 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842445 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842452 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842458 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="kube-rbac-proxy-ovn-metrics" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 
19:51:48.842557 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842564 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.842658 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerName="ovnkube-controller" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.844430 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.855821 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovnkube-controller/3.log" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.860262 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovn-acl-logging/0.log" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.861312 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gpc5z_9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/ovn-controller/0.log" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862171 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862212 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: 
I0216 19:51:48.862228 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862242 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862257 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862270 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" exitCode=0 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862283 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" exitCode=143 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862297 4675 generic.go:334] "Generic (PLEG): container finished" podID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" exitCode=143 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862309 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862379 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862428 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862453 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862472 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862494 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862536 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:51:48 crc 
kubenswrapper[4675]: I0216 19:51:48.862537 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862556 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862574 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862589 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862601 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862612 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862623 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862634 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862646 4675 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862656 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862672 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862718 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862731 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862748 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862760 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862772 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:51:48 crc 
kubenswrapper[4675]: I0216 19:51:48.862784 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862794 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862805 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862815 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862825 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862840 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862858 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862870 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862881 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862893 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862904 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862914 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862925 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862935 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862945 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862955 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862969 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gpc5z" event={"ID":"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba","Type":"ContainerDied","Data":"9e64992c36b7a5261c136d3803e96d18254b5229a957b52f26cfbb1709838f65"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862984 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.862996 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863007 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863018 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863029 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863040 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863050 4675 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863062 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863073 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.863083 4675 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.867127 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/2.log" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.867989 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/1.log" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.868060 4675 generic.go:334] "Generic (PLEG): container finished" podID="c9a99563-d631-455f-8464-160e5619c610" containerID="d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01" exitCode=2 Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.868148 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerDied","Data":"d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.868187 4675 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.869019 4675 scope.go:117] "RemoveContainer" containerID="d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01" Feb 16 19:51:48 crc kubenswrapper[4675]: E0216 19:51:48.869956 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pj5xg_openshift-multus(c9a99563-d631-455f-8464-160e5619c610)\"" pod="openshift-multus/multus-pj5xg" podUID="c9a99563-d631-455f-8464-160e5619c610" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.880656 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497"} Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.889995 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.918800 4675 scope.go:117] "RemoveContainer" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.921106 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hmxhx" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923054 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: 
\"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923123 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923161 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923203 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923239 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923270 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923272 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923322 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923355 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923373 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923415 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket" (OuterVolumeSpecName: "log-socket") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923480 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923508 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923541 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923585 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923627 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923669 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923768 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923809 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923856 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 
19:51:48.923859 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923914 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbc5j\" (UniqueName: \"kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.923990 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.924062 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash" (OuterVolumeSpecName: "host-slash") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.924230 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.924614 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.924753 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log" (OuterVolumeSpecName: "node-log") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.924925 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925080 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925146 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925176 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925511 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925612 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925667 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin\") pod \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\" (UID: \"9b6e2d5a-0472-425b-b5b4-0b94f14ebfba\") " Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.925764 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926229 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-env-overrides\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926495 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-script-lib\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926565 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-log-socket\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926601 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926612 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-bin\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.926660 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-netns\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.927378 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-netd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.927459 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-systemd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.927533 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b433a4-870c-4bb8-b588-614bbec134cd-ovn-node-metrics-cert\") pod \"ovnkube-node-mgsc6\" (UID: 
\"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.927623 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-etc-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.927737 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-config\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928129 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-systemd-units\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928199 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-slash\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928252 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928279 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-ovn\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928319 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928359 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-var-lib-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928385 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928407 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtqkq\" (UniqueName: 
\"kubernetes.io/projected/e3b433a4-870c-4bb8-b588-614bbec134cd-kube-api-access-xtqkq\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928429 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-node-log\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.928484 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-kubelet\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929019 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929404 4675 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929462 4675 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929485 4675 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929506 4675 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-node-log\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929527 4675 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929574 4675 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929595 4675 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-slash\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929623 4675 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929641 4675 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929654 4675 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-cni-bin\") on node \"crc\" DevicePath 
\"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929669 4675 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929705 4675 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929719 4675 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-log-socket\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929731 4675 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.929746 4675 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.934655 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.936345 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.936375 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j" (OuterVolumeSpecName: "kube-api-access-wbc5j") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "kube-api-access-wbc5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.946843 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" (UID: "9b6e2d5a-0472-425b-b5b4-0b94f14ebfba"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 19:51:48 crc kubenswrapper[4675]: I0216 19:51:48.991855 4675 scope.go:117] "RemoveContainer" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.008492 4675 scope.go:117] "RemoveContainer" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.021856 4675 scope.go:117] "RemoveContainer" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.030994 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-script-lib\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031034 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-bin\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031056 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-log-socket\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031079 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-netns\") 
pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031102 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-netd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031176 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-netns\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031178 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-bin\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031178 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-log-socket\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031273 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-systemd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031297 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b433a4-870c-4bb8-b588-614bbec134cd-ovn-node-metrics-cert\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031342 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-systemd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031364 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-cni-netd\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031403 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-etc-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031379 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-etc-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 
19:51:49.031524 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-config\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031583 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-systemd-units\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031627 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-slash\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031682 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031733 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-ovn\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031771 4675 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031802 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-var-lib-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031828 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-script-lib\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031836 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031874 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031916 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-node-log\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031947 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-systemd-units\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031949 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-slash\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031974 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-var-lib-openvswitch\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031880 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-node-log\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032004 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-run-ovn\") pod \"ovnkube-node-mgsc6\" (UID: 
\"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032021 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtqkq\" (UniqueName: \"kubernetes.io/projected/e3b433a4-870c-4bb8-b588-614bbec134cd-kube-api-access-xtqkq\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032054 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-kubelet\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032089 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-env-overrides\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.031916 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032119 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgsc6\" (UID: 
\"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032192 4675 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032208 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbc5j\" (UniqueName: \"kubernetes.io/projected/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-kube-api-access-wbc5j\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032220 4675 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032230 4675 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032268 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-ovnkube-config\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032313 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b433a4-870c-4bb8-b588-614bbec134cd-host-kubelet\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.032886 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b433a4-870c-4bb8-b588-614bbec134cd-env-overrides\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.038958 4675 scope.go:117] "RemoveContainer" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.039286 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b433a4-870c-4bb8-b588-614bbec134cd-ovn-node-metrics-cert\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.049636 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtqkq\" (UniqueName: \"kubernetes.io/projected/e3b433a4-870c-4bb8-b588-614bbec134cd-kube-api-access-xtqkq\") pod \"ovnkube-node-mgsc6\" (UID: \"e3b433a4-870c-4bb8-b588-614bbec134cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.067622 4675 scope.go:117] "RemoveContainer" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.081068 4675 scope.go:117] "RemoveContainer" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.094213 4675 scope.go:117] "RemoveContainer" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.108942 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc 
kubenswrapper[4675]: E0216 19:51:49.109342 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.109395 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} err="failed to get container status \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.109434 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.109861 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": container with ID starting with 07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c not found: ID does not exist" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.109892 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} err="failed to get container status 
\"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": rpc error: code = NotFound desc = could not find container \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": container with ID starting with 07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.109910 4675 scope.go:117] "RemoveContainer" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.110357 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": container with ID starting with d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48 not found: ID does not exist" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.110438 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} err="failed to get container status \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": rpc error: code = NotFound desc = could not find container \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": container with ID starting with d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.110496 4675 scope.go:117] "RemoveContainer" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.110915 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": container with ID starting with 13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535 not found: ID does not exist" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.110983 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} err="failed to get container status \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": rpc error: code = NotFound desc = could not find container \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": container with ID starting with 13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.111023 4675 scope.go:117] "RemoveContainer" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.111457 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": container with ID starting with 1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85 not found: ID does not exist" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.111519 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} err="failed to get container status \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": rpc error: code = NotFound desc = could not find container \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": container with ID 
starting with 1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.111560 4675 scope.go:117] "RemoveContainer" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.112020 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": container with ID starting with 5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8 not found: ID does not exist" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112068 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} err="failed to get container status \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": rpc error: code = NotFound desc = could not find container \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": container with ID starting with 5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112099 4675 scope.go:117] "RemoveContainer" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.112452 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": container with ID starting with 958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa not found: ID does not exist" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 
19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112486 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} err="failed to get container status \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": rpc error: code = NotFound desc = could not find container \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": container with ID starting with 958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112507 4675 scope.go:117] "RemoveContainer" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.112917 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": container with ID starting with 9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9 not found: ID does not exist" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112968 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} err="failed to get container status \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": rpc error: code = NotFound desc = could not find container \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": container with ID starting with 9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.112998 4675 scope.go:117] "RemoveContainer" 
containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.113336 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": container with ID starting with ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4 not found: ID does not exist" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.113385 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} err="failed to get container status \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": rpc error: code = NotFound desc = could not find container \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": container with ID starting with ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.113417 4675 scope.go:117] "RemoveContainer" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: E0216 19:51:49.113742 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": container with ID starting with 87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5 not found: ID does not exist" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.113784 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} err="failed to get container status \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": rpc error: code = NotFound desc = could not find container \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": container with ID starting with 87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.113814 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.114172 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} err="failed to get container status \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.114244 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.114580 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} err="failed to get container status \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": rpc error: code = NotFound desc = could not find container \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": container with ID starting with 07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c not found: ID does not 
exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.114677 4675 scope.go:117] "RemoveContainer" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115081 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} err="failed to get container status \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": rpc error: code = NotFound desc = could not find container \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": container with ID starting with d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115112 4675 scope.go:117] "RemoveContainer" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115362 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} err="failed to get container status \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": rpc error: code = NotFound desc = could not find container \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": container with ID starting with 13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115399 4675 scope.go:117] "RemoveContainer" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115671 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} err="failed to get container status 
\"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": rpc error: code = NotFound desc = could not find container \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": container with ID starting with 1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.115823 4675 scope.go:117] "RemoveContainer" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.116513 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} err="failed to get container status \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": rpc error: code = NotFound desc = could not find container \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": container with ID starting with 5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.116555 4675 scope.go:117] "RemoveContainer" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117138 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} err="failed to get container status \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": rpc error: code = NotFound desc = could not find container \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": container with ID starting with 958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117184 4675 scope.go:117] "RemoveContainer" 
containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117529 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} err="failed to get container status \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": rpc error: code = NotFound desc = could not find container \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": container with ID starting with 9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117561 4675 scope.go:117] "RemoveContainer" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117879 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} err="failed to get container status \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": rpc error: code = NotFound desc = could not find container \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": container with ID starting with ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.117927 4675 scope.go:117] "RemoveContainer" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.118246 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} err="failed to get container status \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": rpc error: code = NotFound desc = could 
not find container \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": container with ID starting with 87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.118278 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.118607 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} err="failed to get container status \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.118634 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.119041 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} err="failed to get container status \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": rpc error: code = NotFound desc = could not find container \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": container with ID starting with 07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.119085 4675 scope.go:117] "RemoveContainer" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 
19:51:49.119421 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} err="failed to get container status \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": rpc error: code = NotFound desc = could not find container \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": container with ID starting with d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.119458 4675 scope.go:117] "RemoveContainer" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120031 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} err="failed to get container status \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": rpc error: code = NotFound desc = could not find container \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": container with ID starting with 13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120059 4675 scope.go:117] "RemoveContainer" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120388 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} err="failed to get container status \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": rpc error: code = NotFound desc = could not find container \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": container with ID starting with 
1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120416 4675 scope.go:117] "RemoveContainer" containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120717 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} err="failed to get container status \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": rpc error: code = NotFound desc = could not find container \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": container with ID starting with 5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.120746 4675 scope.go:117] "RemoveContainer" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121153 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} err="failed to get container status \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": rpc error: code = NotFound desc = could not find container \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": container with ID starting with 958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121196 4675 scope.go:117] "RemoveContainer" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121485 4675 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} err="failed to get container status \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": rpc error: code = NotFound desc = could not find container \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": container with ID starting with 9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121519 4675 scope.go:117] "RemoveContainer" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121809 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} err="failed to get container status \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": rpc error: code = NotFound desc = could not find container \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": container with ID starting with ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.121843 4675 scope.go:117] "RemoveContainer" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.122147 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} err="failed to get container status \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": rpc error: code = NotFound desc = could not find container \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": container with ID starting with 87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5 not found: ID does not 
exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.122176 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.122537 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} err="failed to get container status \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.122558 4675 scope.go:117] "RemoveContainer" containerID="07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.122965 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c"} err="failed to get container status \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": rpc error: code = NotFound desc = could not find container \"07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c\": container with ID starting with 07781eeef14d0ee02a533c5f1b5b7bdb9fa735d2a02636e9671605e69e6a932c not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.123055 4675 scope.go:117] "RemoveContainer" containerID="d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.123433 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48"} err="failed to get container status 
\"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": rpc error: code = NotFound desc = could not find container \"d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48\": container with ID starting with d4a449e69885bb506900d7dbb14ee3ff9eb2d1188267babb04fbe0bb8f057c48 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.123482 4675 scope.go:117] "RemoveContainer" containerID="13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.123845 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535"} err="failed to get container status \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": rpc error: code = NotFound desc = could not find container \"13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535\": container with ID starting with 13af49ba7abbe43edde46130d212f6d76f30bab85a63aef18a84721887e29535 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.123896 4675 scope.go:117] "RemoveContainer" containerID="1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.124371 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85"} err="failed to get container status \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": rpc error: code = NotFound desc = could not find container \"1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85\": container with ID starting with 1e22795f3990f90b901e1ba026e5f95e72508dc865635d19ba881a291dbaed85 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.124414 4675 scope.go:117] "RemoveContainer" 
containerID="5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.124782 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8"} err="failed to get container status \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": rpc error: code = NotFound desc = could not find container \"5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8\": container with ID starting with 5282559b57497e15c0bd828702b4994e8d05f7c1a81ee44e582ce3bb3b64cbb8 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.124850 4675 scope.go:117] "RemoveContainer" containerID="958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.125171 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa"} err="failed to get container status \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": rpc error: code = NotFound desc = could not find container \"958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa\": container with ID starting with 958d71da673d02ed4c067f390a0a94c9f20052f6274ec975f9a4c3034efa7aaa not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.125198 4675 scope.go:117] "RemoveContainer" containerID="9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.125574 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9"} err="failed to get container status \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": rpc error: code = NotFound desc = could 
not find container \"9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9\": container with ID starting with 9296772284fdb3e0ac56ffd5e54f10e7fe1256ccb616d52f103479ebcc6e81f9 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.125614 4675 scope.go:117] "RemoveContainer" containerID="ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.125960 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4"} err="failed to get container status \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": rpc error: code = NotFound desc = could not find container \"ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4\": container with ID starting with ace6ee814e8e429b6ec77e88a0bb2fff35cfbfd3497e7a4d7882e11f437ce5a4 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.126000 4675 scope.go:117] "RemoveContainer" containerID="87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.126271 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5"} err="failed to get container status \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": rpc error: code = NotFound desc = could not find container \"87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5\": container with ID starting with 87374d32a3ee86c4576db098f0cab8d93c90d39a07cd2ab625693121e3b26ca5 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.126297 4675 scope.go:117] "RemoveContainer" containerID="73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 
19:51:49.126612 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1"} err="failed to get container status \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": rpc error: code = NotFound desc = could not find container \"73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1\": container with ID starting with 73209f6cc6e17a7237c040c1bc086b40bb5945de0483d1a3082546f8d16c26b1 not found: ID does not exist" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.179441 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:49 crc kubenswrapper[4675]: W0216 19:51:49.203287 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b433a4_870c_4bb8_b588_614bbec134cd.slice/crio-6cf7c22c1615f81448a3a54f0f86cab281d0b6c5edf4f5bf4f299fe7751a23de WatchSource:0}: Error finding container 6cf7c22c1615f81448a3a54f0f86cab281d0b6c5edf4f5bf4f299fe7751a23de: Status 404 returned error can't find the container with id 6cf7c22c1615f81448a3a54f0f86cab281d0b6c5edf4f5bf4f299fe7751a23de Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.221642 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpc5z"] Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.230733 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gpc5z"] Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.889303 4675 generic.go:334] "Generic (PLEG): container finished" podID="e3b433a4-870c-4bb8-b588-614bbec134cd" containerID="745b988c059ba2470e84bb43d609841b168b82f338bcd13949d275115499ce73" exitCode=0 Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.908419 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9b6e2d5a-0472-425b-b5b4-0b94f14ebfba" path="/var/lib/kubelet/pods/9b6e2d5a-0472-425b-b5b4-0b94f14ebfba/volumes" Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.909643 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerDied","Data":"745b988c059ba2470e84bb43d609841b168b82f338bcd13949d275115499ce73"} Feb 16 19:51:49 crc kubenswrapper[4675]: I0216 19:51:49.909703 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"6cf7c22c1615f81448a3a54f0f86cab281d0b6c5edf4f5bf4f299fe7751a23de"} Feb 16 19:51:50 crc kubenswrapper[4675]: I0216 19:51:50.904333 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"2cadc963cb70932d41635ea7a3da5a649b19c14f2543569f3d6a090185af4290"} Feb 16 19:51:50 crc kubenswrapper[4675]: I0216 19:51:50.904438 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"77870958c89b39e8d7344fb64abc393fa63f97203d7428f9d273a0c02ea9ec83"} Feb 16 19:51:50 crc kubenswrapper[4675]: I0216 19:51:50.904485 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"10e2e9caba780c3cb272edc70e4467ba9d0cd51a936b2ac774efc520be7ef71b"} Feb 16 19:51:50 crc kubenswrapper[4675]: I0216 19:51:50.904499 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" 
event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"f1e82d81651c4708bd6ec64eac383ad463c79f8068ed7f79ed6ef71dd7bedd0a"} Feb 16 19:51:50 crc kubenswrapper[4675]: I0216 19:51:50.904516 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"bf01237b4974113969b0392401deb4fb685710eb8e0ee6b637742179b6b0a09c"} Feb 16 19:51:51 crc kubenswrapper[4675]: I0216 19:51:51.916132 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"e6901731c6c6ae9ae8ab0b33818b4574ddcc6127eb277c31062ba6c9754a0a8e"} Feb 16 19:51:53 crc kubenswrapper[4675]: I0216 19:51:53.938367 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"b939379e8923c950015fd4ae1de4bcd379374c1f3f88b02d55f186a314147c0b"} Feb 16 19:51:55 crc kubenswrapper[4675]: I0216 19:51:55.960275 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" event={"ID":"e3b433a4-870c-4bb8-b588-614bbec134cd","Type":"ContainerStarted","Data":"0d75fce3e74b83ad6bf1d6c9bb3ea8aaf4b22b35745b58b0976b528695fef747"} Feb 16 19:51:55 crc kubenswrapper[4675]: I0216 19:51:55.961312 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:55 crc kubenswrapper[4675]: I0216 19:51:55.997590 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" podStartSLOduration=7.997569005 podStartE2EDuration="7.997569005s" podCreationTimestamp="2026-02-16 19:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 19:51:55.996262711 +0000 UTC m=+599.121552277" watchObservedRunningTime="2026-02-16 19:51:55.997569005 +0000 UTC m=+599.122858571" Feb 16 19:51:56 crc kubenswrapper[4675]: I0216 19:51:56.013040 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:56 crc kubenswrapper[4675]: I0216 19:51:56.968235 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:56 crc kubenswrapper[4675]: I0216 19:51:56.968293 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:57 crc kubenswrapper[4675]: I0216 19:51:57.005174 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:51:58 crc kubenswrapper[4675]: I0216 19:51:58.102653 4675 scope.go:117] "RemoveContainer" containerID="4e43206e68596f3aac0d8c5d9c093d1b5aa57306577c54fc003ab051950ceeac" Feb 16 19:51:58 crc kubenswrapper[4675]: I0216 19:51:58.983215 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/2.log" Feb 16 19:52:03 crc kubenswrapper[4675]: I0216 19:52:03.884274 4675 scope.go:117] "RemoveContainer" containerID="d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01" Feb 16 19:52:03 crc kubenswrapper[4675]: E0216 19:52:03.885361 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pj5xg_openshift-multus(c9a99563-d631-455f-8464-160e5619c610)\"" pod="openshift-multus/multus-pj5xg" podUID="c9a99563-d631-455f-8464-160e5619c610" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.768466 4675 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.770484 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.774609 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4dk8g" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.774675 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.775888 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.872809 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-log\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.872906 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-run\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.873019 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8l4\" (UniqueName: \"kubernetes.io/projected/643b8119-31f7-4402-8fd2-c6956598205b-kube-api-access-hc8l4\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.873214 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-data\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.975218 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8l4\" (UniqueName: \"kubernetes.io/projected/643b8119-31f7-4402-8fd2-c6956598205b-kube-api-access-hc8l4\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.975387 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-data\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.975559 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-log\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.975625 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-run\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.976216 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-data\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.976527 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: 
\"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-log\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:10 crc kubenswrapper[4675]: I0216 19:52:10.976811 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/643b8119-31f7-4402-8fd2-c6956598205b-run\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:11 crc kubenswrapper[4675]: I0216 19:52:11.011743 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8l4\" (UniqueName: \"kubernetes.io/projected/643b8119-31f7-4402-8fd2-c6956598205b-kube-api-access-hc8l4\") pod \"ceph\" (UID: \"643b8119-31f7-4402-8fd2-c6956598205b\") " pod="openstack/ceph" Feb 16 19:52:11 crc kubenswrapper[4675]: I0216 19:52:11.102746 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 16 19:52:11 crc kubenswrapper[4675]: W0216 19:52:11.130039 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod643b8119_31f7_4402_8fd2_c6956598205b.slice/crio-9e17dcb88f2d6a81169abe671c1590d4cc004822275f1e35bb07e2cac4fd7a07 WatchSource:0}: Error finding container 9e17dcb88f2d6a81169abe671c1590d4cc004822275f1e35bb07e2cac4fd7a07: Status 404 returned error can't find the container with id 9e17dcb88f2d6a81169abe671c1590d4cc004822275f1e35bb07e2cac4fd7a07 Feb 16 19:52:11 crc kubenswrapper[4675]: E0216 19:52:11.166050 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:11 crc kubenswrapper[4675]: E0216 19:52:11.181219 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:12 crc kubenswrapper[4675]: I0216 19:52:12.070276 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"643b8119-31f7-4402-8fd2-c6956598205b","Type":"ContainerStarted","Data":"9e17dcb88f2d6a81169abe671c1590d4cc004822275f1e35bb07e2cac4fd7a07"} Feb 16 19:52:12 crc kubenswrapper[4675]: E0216 19:52:12.357542 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:12 crc kubenswrapper[4675]: E0216 19:52:12.376155 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:13 crc kubenswrapper[4675]: E0216 19:52:13.556389 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:13 crc kubenswrapper[4675]: E0216 19:52:13.572076 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:14 crc kubenswrapper[4675]: E0216 19:52:14.743229 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by 
unknown authority" Feb 16 19:52:14 crc kubenswrapper[4675]: E0216 19:52:14.760649 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:15 crc kubenswrapper[4675]: I0216 19:52:15.884543 4675 scope.go:117] "RemoveContainer" containerID="d0d4149182b358057857443c048c0c9d3c148645a2efd00ff51712f6b4d3fc01" Feb 16 19:52:16 crc kubenswrapper[4675]: E0216 19:52:16.012855 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:16 crc kubenswrapper[4675]: E0216 19:52:16.030474 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:16 crc kubenswrapper[4675]: I0216 19:52:16.104060 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj5xg_c9a99563-d631-455f-8464-160e5619c610/kube-multus/2.log" Feb 16 19:52:16 crc kubenswrapper[4675]: I0216 19:52:16.104130 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj5xg" event={"ID":"c9a99563-d631-455f-8464-160e5619c610","Type":"ContainerStarted","Data":"1519949e9f59b87728c7462011b871109b98fce90d6b40df78e728a84b276f00"} Feb 16 19:52:17 crc kubenswrapper[4675]: E0216 19:52:17.206670 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:17 crc 
kubenswrapper[4675]: E0216 19:52:17.220481 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:18 crc kubenswrapper[4675]: E0216 19:52:18.406227 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:18 crc kubenswrapper[4675]: E0216 19:52:18.424311 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:19 crc kubenswrapper[4675]: I0216 19:52:19.215514 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgsc6" Feb 16 19:52:19 crc kubenswrapper[4675]: E0216 19:52:19.596619 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:19 crc kubenswrapper[4675]: E0216 19:52:19.615435 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:20 crc kubenswrapper[4675]: E0216 19:52:20.811343 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 
failed: x509: certificate signed by unknown authority" Feb 16 19:52:20 crc kubenswrapper[4675]: E0216 19:52:20.828199 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:21 crc kubenswrapper[4675]: E0216 19:52:21.989229 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:22 crc kubenswrapper[4675]: E0216 19:52:22.009724 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:23 crc kubenswrapper[4675]: E0216 19:52:23.242357 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:23 crc kubenswrapper[4675]: E0216 19:52:23.257887 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:24 crc kubenswrapper[4675]: E0216 19:52:24.459396 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:24 crc kubenswrapper[4675]: E0216 
19:52:24.476819 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:25 crc kubenswrapper[4675]: E0216 19:52:25.652210 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:25 crc kubenswrapper[4675]: E0216 19:52:25.670325 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:26 crc kubenswrapper[4675]: E0216 19:52:26.874265 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:26 crc kubenswrapper[4675]: E0216 19:52:26.889050 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:28 crc kubenswrapper[4675]: E0216 19:52:28.103367 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:28 crc kubenswrapper[4675]: E0216 19:52:28.123641 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:29 crc kubenswrapper[4675]: E0216 19:52:29.356935 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:29 crc kubenswrapper[4675]: E0216 19:52:29.372560 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:30 crc kubenswrapper[4675]: E0216 19:52:30.334252 4675 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid"
Feb 16 19:52:30 crc kubenswrapper[4675]: E0216 19:52:30.334497 4675 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hc8l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(643b8119-31f7-4402-8fd2-c6956598205b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 19:52:30 crc kubenswrapper[4675]: E0216 19:52:30.335659 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="643b8119-31f7-4402-8fd2-c6956598205b"
Feb 16 19:52:30 crc kubenswrapper[4675]: E0216 19:52:30.591506 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:30 crc kubenswrapper[4675]: E0216 19:52:30.619083 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:31 crc kubenswrapper[4675]: E0216 19:52:31.203600 4675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="643b8119-31f7-4402-8fd2-c6956598205b"
Feb 16 19:52:31 crc kubenswrapper[4675]: E0216 19:52:31.797954 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:31 crc kubenswrapper[4675]: E0216 19:52:31.814238 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:33 crc kubenswrapper[4675]: E0216 19:52:33.057148 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:33 crc kubenswrapper[4675]: E0216 19:52:33.078726 4675 server.go:309] "Unable to authenticate the
request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:34 crc kubenswrapper[4675]: E0216 19:52:34.273595 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:34 crc kubenswrapper[4675]: E0216 19:52:34.296416 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:35 crc kubenswrapper[4675]: E0216 19:52:35.492780 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:35 crc kubenswrapper[4675]: E0216 19:52:35.514409 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:36 crc kubenswrapper[4675]: E0216 19:52:36.707643 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:36 crc kubenswrapper[4675]: E0216 19:52:36.730602 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, 
AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:37 crc kubenswrapper[4675]: E0216 19:52:37.935797 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:37 crc kubenswrapper[4675]: E0216 19:52:37.956426 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:39 crc kubenswrapper[4675]: E0216 19:52:39.191741 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:39 crc kubenswrapper[4675]: E0216 19:52:39.211544 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:40 crc kubenswrapper[4675]: E0216 19:52:40.436104 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:40 crc kubenswrapper[4675]: E0216 19:52:40.456533 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown 
authority"
Feb 16 19:52:41 crc kubenswrapper[4675]: E0216 19:52:41.680839 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:41 crc kubenswrapper[4675]: E0216 19:52:41.701531 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:42 crc kubenswrapper[4675]: E0216 19:52:42.883766 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:42 crc kubenswrapper[4675]: E0216 19:52:42.911932 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:44 crc kubenswrapper[4675]: E0216 19:52:44.128756 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:44 crc kubenswrapper[4675]: E0216 19:52:44.147593 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:44 crc kubenswrapper[4675]: I0216 19:52:44.314542 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"643b8119-31f7-4402-8fd2-c6956598205b","Type":"ContainerStarted","Data":"360be6246c977b98f0ec24d9027e99bc58618d96e48d07c54d89318b446a6425"}
Feb 16 19:52:44 crc kubenswrapper[4675]: I0216 19:52:44.338190 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=2.10243996 podStartE2EDuration="34.33816738s" podCreationTimestamp="2026-02-16 19:52:10 +0000 UTC" firstStartedPulling="2026-02-16 19:52:11.133055577 +0000 UTC m=+614.258345133" lastFinishedPulling="2026-02-16 19:52:43.368782967 +0000 UTC m=+646.494072553" observedRunningTime="2026-02-16 19:52:44.333871757 +0000 UTC m=+647.459161343" watchObservedRunningTime="2026-02-16 19:52:44.33816738 +0000 UTC m=+647.463456956"
Feb 16 19:52:45 crc kubenswrapper[4675]: E0216 19:52:45.379991 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:45 crc kubenswrapper[4675]: E0216 19:52:45.404176 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:46 crc kubenswrapper[4675]: E0216 19:52:46.639810 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:52:46 crc kubenswrapper[4675]: E0216 19:52:46.658040 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509:
certificate signed by unknown authority" Feb 16 19:52:47 crc kubenswrapper[4675]: E0216 19:52:47.876361 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:47 crc kubenswrapper[4675]: E0216 19:52:47.896946 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:49 crc kubenswrapper[4675]: E0216 19:52:49.087550 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:49 crc kubenswrapper[4675]: E0216 19:52:49.113249 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:50 crc kubenswrapper[4675]: E0216 19:52:50.321453 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:50 crc kubenswrapper[4675]: E0216 19:52:50.341459 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:51 crc kubenswrapper[4675]: E0216 19:52:51.542386 4675 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:51 crc kubenswrapper[4675]: E0216 19:52:51.563539 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:52 crc kubenswrapper[4675]: E0216 19:52:52.818068 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:52 crc kubenswrapper[4675]: E0216 19:52:52.845217 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:54 crc kubenswrapper[4675]: E0216 19:52:54.026749 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:54 crc kubenswrapper[4675]: E0216 19:52:54.045102 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:55 crc kubenswrapper[4675]: E0216 19:52:55.235726 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:55 crc kubenswrapper[4675]: E0216 19:52:55.257825 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:56 crc kubenswrapper[4675]: E0216 19:52:56.432435 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:56 crc kubenswrapper[4675]: E0216 19:52:56.448823 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:57 crc kubenswrapper[4675]: E0216 19:52:57.625141 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:57 crc kubenswrapper[4675]: E0216 19:52:57.649920 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:52:58 crc kubenswrapper[4675]: E0216 19:52:58.870771 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: 
certificate signed by unknown authority" Feb 16 19:52:58 crc kubenswrapper[4675]: E0216 19:52:58.903922 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:00 crc kubenswrapper[4675]: E0216 19:53:00.100663 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:00 crc kubenswrapper[4675]: E0216 19:53:00.117910 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:01 crc kubenswrapper[4675]: E0216 19:53:01.273384 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:01 crc kubenswrapper[4675]: E0216 19:53:01.291765 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:02 crc kubenswrapper[4675]: E0216 19:53:02.501942 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:02 crc kubenswrapper[4675]: E0216 19:53:02.522647 4675 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:03 crc kubenswrapper[4675]: E0216 19:53:03.756547 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:03 crc kubenswrapper[4675]: E0216 19:53:03.780061 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:04 crc kubenswrapper[4675]: E0216 19:53:04.999861 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:05 crc kubenswrapper[4675]: E0216 19:53:05.017853 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:06 crc kubenswrapper[4675]: E0216 19:53:06.165645 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:06 crc kubenswrapper[4675]: E0216 19:53:06.181868 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:07 crc kubenswrapper[4675]: E0216 19:53:07.361047 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:07 crc kubenswrapper[4675]: E0216 19:53:07.377732 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:08 crc kubenswrapper[4675]: E0216 19:53:08.554797 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:08 crc kubenswrapper[4675]: E0216 19:53:08.576158 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:09 crc kubenswrapper[4675]: E0216 19:53:09.788505 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:09 crc kubenswrapper[4675]: E0216 19:53:09.812844 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: 
certificate signed by unknown authority" Feb 16 19:53:11 crc kubenswrapper[4675]: E0216 19:53:11.003097 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:11 crc kubenswrapper[4675]: E0216 19:53:11.024815 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:12 crc kubenswrapper[4675]: E0216 19:53:12.220982 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:12 crc kubenswrapper[4675]: E0216 19:53:12.245409 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:13 crc kubenswrapper[4675]: E0216 19:53:13.430484 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:13 crc kubenswrapper[4675]: E0216 19:53:13.454582 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:14 crc kubenswrapper[4675]: E0216 19:53:14.680454 4675 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:14 crc kubenswrapper[4675]: E0216 19:53:14.706047 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:15 crc kubenswrapper[4675]: E0216 19:53:15.885562 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:15 crc kubenswrapper[4675]: E0216 19:53:15.909494 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:17 crc kubenswrapper[4675]: E0216 19:53:17.226117 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:17 crc kubenswrapper[4675]: E0216 19:53:17.242220 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:18 crc kubenswrapper[4675]: E0216 19:53:18.444401 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:18 crc kubenswrapper[4675]: E0216 19:53:18.461023 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:19 crc kubenswrapper[4675]: E0216 19:53:19.609081 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:19 crc kubenswrapper[4675]: E0216 19:53:19.635509 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:20 crc kubenswrapper[4675]: E0216 19:53:20.839800 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:20 crc kubenswrapper[4675]: E0216 19:53:20.861795 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority" Feb 16 19:53:22 crc kubenswrapper[4675]: E0216 19:53:22.058614 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: 
certificate signed by unknown authority"
Feb 16 19:53:22 crc kubenswrapper[4675]: E0216 19:53:22.077297 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:23 crc kubenswrapper[4675]: E0216 19:53:23.240567 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:23 crc kubenswrapper[4675]: E0216 19:53:23.265360 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:24 crc kubenswrapper[4675]: E0216 19:53:24.423221 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:24 crc kubenswrapper[4675]: E0216 19:53:24.444064 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:25 crc kubenswrapper[4675]: E0216 19:53:25.656961 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:25 crc kubenswrapper[4675]: E0216 19:53:25.678988 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:26 crc kubenswrapper[4675]: E0216 19:53:26.882901 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:26 crc kubenswrapper[4675]: E0216 19:53:26.907534 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:28 crc kubenswrapper[4675]: E0216 19:53:28.094725 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:28 crc kubenswrapper[4675]: E0216 19:53:28.110644 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:29 crc kubenswrapper[4675]: E0216 19:53:29.295306 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:29 crc kubenswrapper[4675]: E0216 19:53:29.313866 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:30 crc kubenswrapper[4675]: E0216 19:53:30.475585 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:30 crc kubenswrapper[4675]: E0216 19:53:30.499670 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:31 crc kubenswrapper[4675]: E0216 19:53:31.653413 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:31 crc kubenswrapper[4675]: E0216 19:53:31.673832 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:32 crc kubenswrapper[4675]: E0216 19:53:32.897510 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:32 crc kubenswrapper[4675]: E0216 19:53:32.919998 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:34 crc kubenswrapper[4675]: E0216 19:53:34.127487 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:34 crc kubenswrapper[4675]: E0216 19:53:34.149823 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:35 crc kubenswrapper[4675]: E0216 19:53:35.345171 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:35 crc kubenswrapper[4675]: E0216 19:53:35.368649 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:36 crc kubenswrapper[4675]: E0216 19:53:36.583775 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:36 crc kubenswrapper[4675]: E0216 19:53:36.607384 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:37 crc kubenswrapper[4675]: E0216 19:53:37.821816 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:37 crc kubenswrapper[4675]: E0216 19:53:37.845733 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:39 crc kubenswrapper[4675]: E0216 19:53:39.059312 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:39 crc kubenswrapper[4675]: E0216 19:53:39.081053 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:40 crc kubenswrapper[4675]: E0216 19:53:40.269925 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:40 crc kubenswrapper[4675]: E0216 19:53:40.291528 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:41 crc kubenswrapper[4675]: E0216 19:53:41.482533 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:41 crc kubenswrapper[4675]: E0216 19:53:41.503341 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:42 crc kubenswrapper[4675]: E0216 19:53:42.671418 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:42 crc kubenswrapper[4675]: E0216 19:53:42.694194 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:43 crc kubenswrapper[4675]: E0216 19:53:43.874158 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:43 crc kubenswrapper[4675]: E0216 19:53:43.897293 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:45 crc kubenswrapper[4675]: E0216 19:53:45.102805 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:45 crc kubenswrapper[4675]: E0216 19:53:45.118034 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:46 crc kubenswrapper[4675]: E0216 19:53:46.314616 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:46 crc kubenswrapper[4675]: E0216 19:53:46.336543 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:47 crc kubenswrapper[4675]: E0216 19:53:47.533634 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:47 crc kubenswrapper[4675]: I0216 19:53:47.553960 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:53:47 crc kubenswrapper[4675]: I0216 19:53:47.554055 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:53:47 crc kubenswrapper[4675]: E0216 19:53:47.556444 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:48 crc kubenswrapper[4675]: E0216 19:53:48.771459 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:48 crc kubenswrapper[4675]: E0216 19:53:48.796197 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:50 crc kubenswrapper[4675]: E0216 19:53:50.007856 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:50 crc kubenswrapper[4675]: E0216 19:53:50.033649 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:51 crc kubenswrapper[4675]: E0216 19:53:51.209751 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:51 crc kubenswrapper[4675]: E0216 19:53:51.232777 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:52 crc kubenswrapper[4675]: E0216 19:53:52.459404 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:52 crc kubenswrapper[4675]: E0216 19:53:52.480113 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:53 crc kubenswrapper[4675]: E0216 19:53:53.654237 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:53 crc kubenswrapper[4675]: E0216 19:53:53.676786 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:54 crc kubenswrapper[4675]: E0216 19:53:54.858359 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:54 crc kubenswrapper[4675]: E0216 19:53:54.879293 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:56 crc kubenswrapper[4675]: E0216 19:53:56.118118 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:56 crc kubenswrapper[4675]: E0216 19:53:56.143065 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:57 crc kubenswrapper[4675]: E0216 19:53:57.300674 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:57 crc kubenswrapper[4675]: E0216 19:53:57.323576 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:58 crc kubenswrapper[4675]: E0216 19:53:58.479877 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:58 crc kubenswrapper[4675]: E0216 19:53:58.503976 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:59 crc kubenswrapper[4675]: E0216 19:53:59.694801 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:53:59 crc kubenswrapper[4675]: E0216 19:53:59.713279 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:00 crc kubenswrapper[4675]: E0216 19:54:00.933449 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:00 crc kubenswrapper[4675]: E0216 19:54:00.957528 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:02 crc kubenswrapper[4675]: E0216 19:54:02.178734 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:02 crc kubenswrapper[4675]: E0216 19:54:02.200581 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:03 crc kubenswrapper[4675]: E0216 19:54:03.335935 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:03 crc kubenswrapper[4675]: E0216 19:54:03.359906 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:04 crc kubenswrapper[4675]: E0216 19:54:04.565665 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:04 crc kubenswrapper[4675]: E0216 19:54:04.593388 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:05 crc kubenswrapper[4675]: E0216 19:54:05.792989 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:05 crc kubenswrapper[4675]: E0216 19:54:05.820764 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:07 crc kubenswrapper[4675]: E0216 19:54:07.067458 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:07 crc kubenswrapper[4675]: E0216 19:54:07.091256 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:08 crc kubenswrapper[4675]: E0216 19:54:08.285460 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:08 crc kubenswrapper[4675]: E0216 19:54:08.308146 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:09 crc kubenswrapper[4675]: E0216 19:54:09.490244 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:09 crc kubenswrapper[4675]: E0216 19:54:09.511570 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:10 crc kubenswrapper[4675]: E0216 19:54:10.694956 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:10 crc kubenswrapper[4675]: E0216 19:54:10.716502 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:11 crc kubenswrapper[4675]: E0216 19:54:11.924181 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:11 crc kubenswrapper[4675]: E0216 19:54:11.941646 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:13 crc kubenswrapper[4675]: E0216 19:54:13.131755 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:13 crc kubenswrapper[4675]: E0216 19:54:13.206422 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:14 crc kubenswrapper[4675]: E0216 19:54:14.420298 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:14 crc kubenswrapper[4675]: E0216 19:54:14.441774 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:15 crc kubenswrapper[4675]: E0216 19:54:15.661385 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:15 crc kubenswrapper[4675]: E0216 19:54:15.684677 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:16 crc kubenswrapper[4675]: E0216 19:54:16.866554 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:16 crc kubenswrapper[4675]: E0216 19:54:16.891086 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:17 crc kubenswrapper[4675]: I0216 19:54:17.557021 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:54:17 crc kubenswrapper[4675]: I0216 19:54:17.557115 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:54:18 crc kubenswrapper[4675]: E0216 19:54:18.041381 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:18 crc kubenswrapper[4675]: E0216 19:54:18.057772 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:19 crc kubenswrapper[4675]: E0216 19:54:19.289613 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:19 crc kubenswrapper[4675]: E0216 19:54:19.311841 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:20 crc kubenswrapper[4675]: E0216 19:54:20.557034 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:20 crc kubenswrapper[4675]: E0216 19:54:20.577219 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:21 crc kubenswrapper[4675]: E0216 19:54:21.796864 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:21 crc kubenswrapper[4675]: E0216 19:54:21.819745 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:23 crc kubenswrapper[4675]: E0216 19:54:23.028068 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:23 crc kubenswrapper[4675]: E0216 19:54:23.083757 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:24 crc kubenswrapper[4675]: E0216 19:54:24.301138 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:24 crc kubenswrapper[4675]: E0216 19:54:24.324074 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:25 crc kubenswrapper[4675]: E0216 19:54:25.475132 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:25 crc kubenswrapper[4675]: E0216 19:54:25.492881 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:26 crc kubenswrapper[4675]: E0216 19:54:26.707881 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:26 crc kubenswrapper[4675]: E0216 19:54:26.731295 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:27 crc kubenswrapper[4675]: E0216 19:54:27.928286 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:27 crc kubenswrapper[4675]: E0216 19:54:27.946011 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:29 crc kubenswrapper[4675]: E0216 19:54:29.165538 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:29 crc kubenswrapper[4675]: E0216 19:54:29.193196 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:30 crc kubenswrapper[4675]: E0216 19:54:30.376489 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:30 crc kubenswrapper[4675]: E0216 19:54:30.402672 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:31 crc kubenswrapper[4675]: E0216 19:54:31.612441 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:31 crc kubenswrapper[4675]: E0216 19:54:31.633417 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:32 crc kubenswrapper[4675]: E0216 19:54:32.815337 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:32 crc kubenswrapper[4675]: E0216 19:54:32.838869 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:34 crc kubenswrapper[4675]: E0216 19:54:34.066368 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:34 crc kubenswrapper[4675]: E0216 19:54:34.088442 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:35 crc kubenswrapper[4675]: E0216 19:54:35.306763 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:35 crc kubenswrapper[4675]: E0216 19:54:35.329074 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:36 crc kubenswrapper[4675]: E0216 19:54:36.507147 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:36 crc kubenswrapper[4675]: E0216 19:54:36.528281 4675 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5671133130190374538, SKID=, AKID=2F:41:F6:70:DD:33:F6:9A:E6:AF:90:F2:C4:9D:35:39:0F:7A:BB:71 failed: x509: certificate signed by unknown authority"
Feb 16 19:54:36 crc kubenswrapper[4675]: I0216 19:54:36.816556 4675 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 16 19:54:47 crc kubenswrapper[4675]: I0216 19:54:47.557628 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 19:54:47 crc kubenswrapper[4675]: I0216 19:54:47.560207 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 19:54:47 crc kubenswrapper[4675]: I0216 19:54:47.560558 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb"
Feb 16 19:54:47 crc kubenswrapper[4675]: I0216 19:54:47.562424 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 19:54:47 crc kubenswrapper[4675]: I0216 19:54:47.562856
4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" containerID="cri-o://55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497" gracePeriod=600 Feb 16 19:54:48 crc kubenswrapper[4675]: I0216 19:54:48.231972 4675 generic.go:334] "Generic (PLEG): container finished" podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497" exitCode=0 Feb 16 19:54:48 crc kubenswrapper[4675]: I0216 19:54:48.232170 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497"} Feb 16 19:54:48 crc kubenswrapper[4675]: I0216 19:54:48.232990 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6"} Feb 16 19:54:48 crc kubenswrapper[4675]: I0216 19:54:48.233061 4675 scope.go:117] "RemoveContainer" containerID="069e42dd31d4cde076111a9692275d9e8e389cb1174b34be830192e3cff8cc6c" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.641255 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wfqzm/must-gather-827k7"] Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.646293 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.650001 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wfqzm"/"kube-root-ca.crt" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.650303 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wfqzm"/"openshift-service-ca.crt" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.675067 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wfqzm/must-gather-827k7"] Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.789973 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwhd5\" (UniqueName: \"kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.790032 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.891880 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwhd5\" (UniqueName: \"kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.891953 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.892572 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.926512 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwhd5\" (UniqueName: \"kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5\") pod \"must-gather-827k7\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:08 crc kubenswrapper[4675]: I0216 19:55:08.969022 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:55:09 crc kubenswrapper[4675]: I0216 19:55:09.445392 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wfqzm/must-gather-827k7"] Feb 16 19:55:10 crc kubenswrapper[4675]: I0216 19:55:10.389957 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfqzm/must-gather-827k7" event={"ID":"97914e45-9fe0-4c19-8aae-2ade5f9afd1a","Type":"ContainerStarted","Data":"b848e49b4aea066c73dd597b45ed89d646accdb2574963c2ce03edf62d7079b6"} Feb 16 19:55:16 crc kubenswrapper[4675]: I0216 19:55:16.441888 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfqzm/must-gather-827k7" event={"ID":"97914e45-9fe0-4c19-8aae-2ade5f9afd1a","Type":"ContainerStarted","Data":"536b66f204ebfaf9473337dce75923dca5e7e96d7df52b5f7a3c0913f10bbbe2"} Feb 16 19:55:16 crc kubenswrapper[4675]: I0216 19:55:16.442736 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfqzm/must-gather-827k7" event={"ID":"97914e45-9fe0-4c19-8aae-2ade5f9afd1a","Type":"ContainerStarted","Data":"a6ba6da673eb1b658d7c83b42fae66a78bc838f18c85356b9c5bcb67ebfb96c5"} Feb 16 19:55:16 crc kubenswrapper[4675]: I0216 19:55:16.468152 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wfqzm/must-gather-827k7" podStartSLOduration=2.548710152 podStartE2EDuration="8.468121283s" podCreationTimestamp="2026-02-16 19:55:08 +0000 UTC" firstStartedPulling="2026-02-16 19:55:09.455486107 +0000 UTC m=+792.580775693" lastFinishedPulling="2026-02-16 19:55:15.374897238 +0000 UTC m=+798.500186824" observedRunningTime="2026-02-16 19:55:16.465490244 +0000 UTC m=+799.590779840" watchObservedRunningTime="2026-02-16 19:55:16.468121283 +0000 UTC m=+799.593410909" Feb 16 19:55:32 crc kubenswrapper[4675]: I0216 19:55:32.163228 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph_643b8119-31f7-4402-8fd2-c6956598205b/ceph/0.log" Feb 16 19:55:59 crc kubenswrapper[4675]: I0216 19:55:59.558083 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-846bd_02491684-b05b-4878-b335-ba69ffbe08c9/control-plane-machine-set-operator/0.log" Feb 16 19:55:59 crc kubenswrapper[4675]: I0216 19:55:59.741851 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lj8jk_8de617fc-9ec5-4fd7-81a3-d1c7621bf288/kube-rbac-proxy/0.log" Feb 16 19:55:59 crc kubenswrapper[4675]: I0216 19:55:59.744248 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lj8jk_8de617fc-9ec5-4fd7-81a3-d1c7621bf288/machine-api-operator/0.log" Feb 16 19:56:13 crc kubenswrapper[4675]: I0216 19:56:13.475762 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-l4grk_226e5ab0-119e-40ef-86f9-e2243de79a09/cert-manager-controller/0.log" Feb 16 19:56:13 crc kubenswrapper[4675]: I0216 19:56:13.614708 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j4tq2_8acb6f4c-29a5-4b86-bba5-3754ae8ec0ea/cert-manager-cainjector/0.log" Feb 16 19:56:13 crc kubenswrapper[4675]: I0216 19:56:13.690183 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hmxhx_b779aec3-fe64-4ef2-ba6b-8caf7b98a984/cert-manager-webhook/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.148590 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-utilities/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.279958 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-utilities/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.297656 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-content/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.311227 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-content/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.470210 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-content/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.483629 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/extract-utilities/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.574405 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8hb2k_1984d92c-2f8f-431e-9006-2a8e14bad660/registry-server/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.641037 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-utilities/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.822463 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-utilities/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.852765 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-content/0.log" Feb 16 19:56:44 crc kubenswrapper[4675]: I0216 19:56:44.873032 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-content/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.025497 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-utilities/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.045511 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/extract-content/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.186702 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxn99_fa11d522-5f3a-456b-b8f4-9e60fd4ad519/registry-server/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.236214 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2pww7_e7e5daa8-b9d8-4f3f-902e-36273ad65acb/marketplace-operator/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.342562 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-utilities/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.492510 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-utilities/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.499221 4675 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-content/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.521063 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-content/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.704495 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-utilities/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.718304 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/extract-content/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.766767 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rnmhj_e3ad6b2a-aa4d-45e9-8931-019ca2d29e19/registry-server/0.log" Feb 16 19:56:45 crc kubenswrapper[4675]: I0216 19:56:45.881054 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-utilities/0.log" Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.081996 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-content/0.log" Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.081996 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-content/0.log" Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.083969 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-utilities/0.log" 
Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.237742 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-utilities/0.log" Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.299783 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/extract-content/0.log" Feb 16 19:56:46 crc kubenswrapper[4675]: I0216 19:56:46.376945 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdjm9_fe82aa8d-5f71-47cf-9c1f-6b86a66fe07b/registry-server/0.log" Feb 16 19:56:47 crc kubenswrapper[4675]: I0216 19:56:47.554117 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:56:47 crc kubenswrapper[4675]: I0216 19:56:47.554201 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:57:17 crc kubenswrapper[4675]: I0216 19:57:17.554373 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 19:57:17 crc kubenswrapper[4675]: I0216 19:57:17.557215 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" 
podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.317761 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.320749 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.326825 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.482052 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bj25\" (UniqueName: \"kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.482338 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.482712 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc 
kubenswrapper[4675]: I0216 19:57:24.584196 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.584319 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bj25\" (UniqueName: \"kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.584404 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.584976 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.585183 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.618012 4675 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bj25\" (UniqueName: \"kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25\") pod \"redhat-operators-zn64z\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.640466 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:24 crc kubenswrapper[4675]: I0216 19:57:24.855338 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:25 crc kubenswrapper[4675]: I0216 19:57:25.324980 4675 generic.go:334] "Generic (PLEG): container finished" podID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerID="a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc" exitCode=0 Feb 16 19:57:25 crc kubenswrapper[4675]: I0216 19:57:25.325139 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerDied","Data":"a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc"} Feb 16 19:57:25 crc kubenswrapper[4675]: I0216 19:57:25.325437 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerStarted","Data":"e711e20998d8922d17e1c72e5a47c414785ab75c21feda9e30ea5f962893dfd6"} Feb 16 19:57:25 crc kubenswrapper[4675]: I0216 19:57:25.327127 4675 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 19:57:26 crc kubenswrapper[4675]: I0216 19:57:26.333159 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" 
event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerStarted","Data":"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941"} Feb 16 19:57:27 crc kubenswrapper[4675]: I0216 19:57:27.344530 4675 generic.go:334] "Generic (PLEG): container finished" podID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerID="c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941" exitCode=0 Feb 16 19:57:27 crc kubenswrapper[4675]: I0216 19:57:27.344593 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerDied","Data":"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941"} Feb 16 19:57:28 crc kubenswrapper[4675]: I0216 19:57:28.352099 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerStarted","Data":"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6"} Feb 16 19:57:28 crc kubenswrapper[4675]: I0216 19:57:28.382249 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn64z" podStartSLOduration=1.91512127 podStartE2EDuration="4.382232235s" podCreationTimestamp="2026-02-16 19:57:24 +0000 UTC" firstStartedPulling="2026-02-16 19:57:25.326857544 +0000 UTC m=+928.452147100" lastFinishedPulling="2026-02-16 19:57:27.793968509 +0000 UTC m=+930.919258065" observedRunningTime="2026-02-16 19:57:28.378842166 +0000 UTC m=+931.504131722" watchObservedRunningTime="2026-02-16 19:57:28.382232235 +0000 UTC m=+931.507521791" Feb 16 19:57:31 crc kubenswrapper[4675]: I0216 19:57:31.949943 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 19:57:31 crc kubenswrapper[4675]: I0216 19:57:31.952031 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:31 crc kubenswrapper[4675]: I0216 19:57:31.980195 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.029069 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.029256 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.029632 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqw7\" (UniqueName: \"kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.130563 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqw7\" (UniqueName: \"kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.130681 4675 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.130718 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.131277 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.131803 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.176572 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqw7\" (UniqueName: \"kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7\") pod \"community-operators-8rsj4\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.283204 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:32 crc kubenswrapper[4675]: I0216 19:57:32.611341 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 19:57:32 crc kubenswrapper[4675]: W0216 19:57:32.620070 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0771373e_73e4_439b_82b6_fda6035cb8b3.slice/crio-90b153c1ac86fb5fb99f3c7d00b749a318295d9747e614652cbb295d916e615f WatchSource:0}: Error finding container 90b153c1ac86fb5fb99f3c7d00b749a318295d9747e614652cbb295d916e615f: Status 404 returned error can't find the container with id 90b153c1ac86fb5fb99f3c7d00b749a318295d9747e614652cbb295d916e615f Feb 16 19:57:33 crc kubenswrapper[4675]: I0216 19:57:33.382650 4675 generic.go:334] "Generic (PLEG): container finished" podID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerID="c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96" exitCode=0 Feb 16 19:57:33 crc kubenswrapper[4675]: I0216 19:57:33.382766 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerDied","Data":"c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96"} Feb 16 19:57:33 crc kubenswrapper[4675]: I0216 19:57:33.383841 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerStarted","Data":"90b153c1ac86fb5fb99f3c7d00b749a318295d9747e614652cbb295d916e615f"} Feb 16 19:57:34 crc kubenswrapper[4675]: I0216 19:57:34.391395 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" 
event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerStarted","Data":"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b"} Feb 16 19:57:34 crc kubenswrapper[4675]: I0216 19:57:34.641283 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:34 crc kubenswrapper[4675]: I0216 19:57:34.641363 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.402972 4675 generic.go:334] "Generic (PLEG): container finished" podID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerID="cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b" exitCode=0 Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.403218 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerDied","Data":"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b"} Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.712613 4675 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zn64z" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="registry-server" probeResult="failure" output=< Feb 16 19:57:35 crc kubenswrapper[4675]: timeout: failed to connect service ":50051" within 1s Feb 16 19:57:35 crc kubenswrapper[4675]: > Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.924878 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-25zqb"] Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.936879 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.938765 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-25zqb"] Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.989805 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-catalog-content\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.989921 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6r5\" (UniqueName: \"kubernetes.io/projected/a4944b73-cd02-4a14-9d6f-4793e254e211-kube-api-access-hz6r5\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:35 crc kubenswrapper[4675]: I0216 19:57:35.989998 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-utilities\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.091215 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-utilities\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.091284 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-catalog-content\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.091332 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz6r5\" (UniqueName: \"kubernetes.io/projected/a4944b73-cd02-4a14-9d6f-4793e254e211-kube-api-access-hz6r5\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.092120 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-utilities\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.092336 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4944b73-cd02-4a14-9d6f-4793e254e211-catalog-content\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.126165 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6r5\" (UniqueName: \"kubernetes.io/projected/a4944b73-cd02-4a14-9d6f-4793e254e211-kube-api-access-hz6r5\") pod \"certified-operators-25zqb\" (UID: \"a4944b73-cd02-4a14-9d6f-4793e254e211\") " pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.285089 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.411703 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerStarted","Data":"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4"} Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.584532 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rsj4" podStartSLOduration=3.150112708 podStartE2EDuration="5.584510632s" podCreationTimestamp="2026-02-16 19:57:31 +0000 UTC" firstStartedPulling="2026-02-16 19:57:33.38501143 +0000 UTC m=+936.510300976" lastFinishedPulling="2026-02-16 19:57:35.819409304 +0000 UTC m=+938.944698900" observedRunningTime="2026-02-16 19:57:36.434173437 +0000 UTC m=+939.559462993" watchObservedRunningTime="2026-02-16 19:57:36.584510632 +0000 UTC m=+939.709800188" Feb 16 19:57:36 crc kubenswrapper[4675]: I0216 19:57:36.585335 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-25zqb"] Feb 16 19:57:36 crc kubenswrapper[4675]: W0216 19:57:36.591539 4675 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4944b73_cd02_4a14_9d6f_4793e254e211.slice/crio-6d410a971f9d76b2774ef2ca4f6f2aca3661eee929b8ef2dacfbe14c58115c52 WatchSource:0}: Error finding container 6d410a971f9d76b2774ef2ca4f6f2aca3661eee929b8ef2dacfbe14c58115c52: Status 404 returned error can't find the container with id 6d410a971f9d76b2774ef2ca4f6f2aca3661eee929b8ef2dacfbe14c58115c52 Feb 16 19:57:37 crc kubenswrapper[4675]: I0216 19:57:37.419344 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25zqb" 
event={"ID":"a4944b73-cd02-4a14-9d6f-4793e254e211","Type":"ContainerStarted","Data":"1c2f4e3a4afb6bd3dfd6b77e2f2a92ccfb6e7d2ab9d52c35f1184c68f301003c"} Feb 16 19:57:37 crc kubenswrapper[4675]: I0216 19:57:37.419833 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25zqb" event={"ID":"a4944b73-cd02-4a14-9d6f-4793e254e211","Type":"ContainerStarted","Data":"6d410a971f9d76b2774ef2ca4f6f2aca3661eee929b8ef2dacfbe14c58115c52"} Feb 16 19:57:38 crc kubenswrapper[4675]: I0216 19:57:38.431226 4675 generic.go:334] "Generic (PLEG): container finished" podID="a4944b73-cd02-4a14-9d6f-4793e254e211" containerID="1c2f4e3a4afb6bd3dfd6b77e2f2a92ccfb6e7d2ab9d52c35f1184c68f301003c" exitCode=0 Feb 16 19:57:38 crc kubenswrapper[4675]: I0216 19:57:38.432188 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25zqb" event={"ID":"a4944b73-cd02-4a14-9d6f-4793e254e211","Type":"ContainerDied","Data":"1c2f4e3a4afb6bd3dfd6b77e2f2a92ccfb6e7d2ab9d52c35f1184c68f301003c"} Feb 16 19:57:42 crc kubenswrapper[4675]: I0216 19:57:42.284052 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:42 crc kubenswrapper[4675]: I0216 19:57:42.284878 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:42 crc kubenswrapper[4675]: I0216 19:57:42.346117 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:42 crc kubenswrapper[4675]: I0216 19:57:42.511283 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:42 crc kubenswrapper[4675]: I0216 19:57:42.586482 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 
19:57:43 crc kubenswrapper[4675]: I0216 19:57:43.464172 4675 generic.go:334] "Generic (PLEG): container finished" podID="a4944b73-cd02-4a14-9d6f-4793e254e211" containerID="480594b7bd15f41afebe6dc550308c35c2b53a62525d1607ca152588f8f50749" exitCode=0 Feb 16 19:57:43 crc kubenswrapper[4675]: I0216 19:57:43.464245 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25zqb" event={"ID":"a4944b73-cd02-4a14-9d6f-4793e254e211","Type":"ContainerDied","Data":"480594b7bd15f41afebe6dc550308c35c2b53a62525d1607ca152588f8f50749"} Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.475528 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25zqb" event={"ID":"a4944b73-cd02-4a14-9d6f-4793e254e211","Type":"ContainerStarted","Data":"531979dd7191aeaf3b36d53a242d17231b51b9b9d1fe68cdf1c9d5a10d8ddf0e"} Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.477043 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8rsj4" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="registry-server" containerID="cri-o://65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4" gracePeriod=2 Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.519276 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-25zqb" podStartSLOduration=4.058683354 podStartE2EDuration="9.519241761s" podCreationTimestamp="2026-02-16 19:57:35 +0000 UTC" firstStartedPulling="2026-02-16 19:57:38.433945968 +0000 UTC m=+941.559235554" lastFinishedPulling="2026-02-16 19:57:43.894504375 +0000 UTC m=+947.019793961" observedRunningTime="2026-02-16 19:57:44.50854316 +0000 UTC m=+947.633832736" watchObservedRunningTime="2026-02-16 19:57:44.519241761 +0000 UTC m=+947.644531347" Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.713490 4675 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.788425 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:44 crc kubenswrapper[4675]: I0216 19:57:44.948120 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.015151 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content\") pod \"0771373e-73e4-439b-82b6-fda6035cb8b3\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.015224 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhqw7\" (UniqueName: \"kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7\") pod \"0771373e-73e4-439b-82b6-fda6035cb8b3\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.015330 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities\") pod \"0771373e-73e4-439b-82b6-fda6035cb8b3\" (UID: \"0771373e-73e4-439b-82b6-fda6035cb8b3\") " Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.016388 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities" (OuterVolumeSpecName: "utilities") pod "0771373e-73e4-439b-82b6-fda6035cb8b3" (UID: "0771373e-73e4-439b-82b6-fda6035cb8b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.046371 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7" (OuterVolumeSpecName: "kube-api-access-fhqw7") pod "0771373e-73e4-439b-82b6-fda6035cb8b3" (UID: "0771373e-73e4-439b-82b6-fda6035cb8b3"). InnerVolumeSpecName "kube-api-access-fhqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.077294 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0771373e-73e4-439b-82b6-fda6035cb8b3" (UID: "0771373e-73e4-439b-82b6-fda6035cb8b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.116381 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.116440 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0771373e-73e4-439b-82b6-fda6035cb8b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.116457 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhqw7\" (UniqueName: \"kubernetes.io/projected/0771373e-73e4-439b-82b6-fda6035cb8b3-kube-api-access-fhqw7\") on node \"crc\" DevicePath \"\"" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.486881 4675 generic.go:334] "Generic (PLEG): container finished" podID="0771373e-73e4-439b-82b6-fda6035cb8b3" 
containerID="65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4" exitCode=0 Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.487018 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerDied","Data":"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4"} Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.487097 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rsj4" event={"ID":"0771373e-73e4-439b-82b6-fda6035cb8b3","Type":"ContainerDied","Data":"90b153c1ac86fb5fb99f3c7d00b749a318295d9747e614652cbb295d916e615f"} Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.487125 4675 scope.go:117] "RemoveContainer" containerID="65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.488090 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rsj4" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.514883 4675 scope.go:117] "RemoveContainer" containerID="cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.532406 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.535320 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8rsj4"] Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.543203 4675 scope.go:117] "RemoveContainer" containerID="c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.575364 4675 scope.go:117] "RemoveContainer" containerID="65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4" Feb 16 19:57:45 crc kubenswrapper[4675]: E0216 19:57:45.576964 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4\": container with ID starting with 65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4 not found: ID does not exist" containerID="65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.577019 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4"} err="failed to get container status \"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4\": rpc error: code = NotFound desc = could not find container \"65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4\": container with ID starting with 65f91795025ad9dcf98646f7f058b5bc172348fb81299a8b95ad42afeab768c4 not 
found: ID does not exist" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.577050 4675 scope.go:117] "RemoveContainer" containerID="cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b" Feb 16 19:57:45 crc kubenswrapper[4675]: E0216 19:57:45.577533 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b\": container with ID starting with cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b not found: ID does not exist" containerID="cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.577595 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b"} err="failed to get container status \"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b\": rpc error: code = NotFound desc = could not find container \"cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b\": container with ID starting with cfc28368218beb27d25959bf53c41cb8197b495c5d717388e4259cfbffade85b not found: ID does not exist" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.577729 4675 scope.go:117] "RemoveContainer" containerID="c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96" Feb 16 19:57:45 crc kubenswrapper[4675]: E0216 19:57:45.578074 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96\": container with ID starting with c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96 not found: ID does not exist" containerID="c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.578107 4675 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96"} err="failed to get container status \"c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96\": rpc error: code = NotFound desc = could not find container \"c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96\": container with ID starting with c55ecb6a12849e534a2d34c826ce2803a2a7f3c9805496852a99f55246e52b96 not found: ID does not exist" Feb 16 19:57:45 crc kubenswrapper[4675]: I0216 19:57:45.893653 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" path="/var/lib/kubelet/pods/0771373e-73e4-439b-82b6-fda6035cb8b3/volumes" Feb 16 19:57:46 crc kubenswrapper[4675]: I0216 19:57:46.285998 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:46 crc kubenswrapper[4675]: I0216 19:57:46.286082 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:46 crc kubenswrapper[4675]: I0216 19:57:46.365510 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:46 crc kubenswrapper[4675]: I0216 19:57:46.989058 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:46 crc kubenswrapper[4675]: I0216 19:57:46.989365 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zn64z" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="registry-server" containerID="cri-o://2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6" gracePeriod=2 Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.451928 4675 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.469422 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities\") pod \"f23a915d-2f12-4529-995b-0a103d0f97a2\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.469583 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bj25\" (UniqueName: \"kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25\") pod \"f23a915d-2f12-4529-995b-0a103d0f97a2\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.469635 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content\") pod \"f23a915d-2f12-4529-995b-0a103d0f97a2\" (UID: \"f23a915d-2f12-4529-995b-0a103d0f97a2\") " Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.470311 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities" (OuterVolumeSpecName: "utilities") pod "f23a915d-2f12-4529-995b-0a103d0f97a2" (UID: "f23a915d-2f12-4529-995b-0a103d0f97a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.479643 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25" (OuterVolumeSpecName: "kube-api-access-9bj25") pod "f23a915d-2f12-4529-995b-0a103d0f97a2" (UID: "f23a915d-2f12-4529-995b-0a103d0f97a2"). InnerVolumeSpecName "kube-api-access-9bj25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.503574 4675 generic.go:334] "Generic (PLEG): container finished" podID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerID="2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6" exitCode=0 Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.504742 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn64z" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.504772 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerDied","Data":"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6"} Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.504814 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn64z" event={"ID":"f23a915d-2f12-4529-995b-0a103d0f97a2","Type":"ContainerDied","Data":"e711e20998d8922d17e1c72e5a47c414785ab75c21feda9e30ea5f962893dfd6"} Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.504838 4675 scope.go:117] "RemoveContainer" containerID="2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.529637 4675 scope.go:117] "RemoveContainer" containerID="c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.554209 4675 scope.go:117] "RemoveContainer" containerID="a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.554326 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.554356 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.554397 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.555034 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.555088 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" containerID="cri-o://5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6" gracePeriod=600 Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.571774 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.571804 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bj25\" (UniqueName: \"kubernetes.io/projected/f23a915d-2f12-4529-995b-0a103d0f97a2-kube-api-access-9bj25\") on node \"crc\" DevicePath 
\"\"" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.575699 4675 scope.go:117] "RemoveContainer" containerID="2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6" Feb 16 19:57:47 crc kubenswrapper[4675]: E0216 19:57:47.576436 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6\": container with ID starting with 2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6 not found: ID does not exist" containerID="2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.576491 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6"} err="failed to get container status \"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6\": rpc error: code = NotFound desc = could not find container \"2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6\": container with ID starting with 2353bb199efb2f31e3bdaf37ecd757a234b45cae6444dd65266e6595642e2cd6 not found: ID does not exist" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.576528 4675 scope.go:117] "RemoveContainer" containerID="c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941" Feb 16 19:57:47 crc kubenswrapper[4675]: E0216 19:57:47.577033 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941\": container with ID starting with c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941 not found: ID does not exist" containerID="c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.577078 4675 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941"} err="failed to get container status \"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941\": rpc error: code = NotFound desc = could not find container \"c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941\": container with ID starting with c272e436f8f07731d2fe19faebb55dd2f7b1c13bedb954047d8e33dab8c35941 not found: ID does not exist" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.577096 4675 scope.go:117] "RemoveContainer" containerID="a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc" Feb 16 19:57:47 crc kubenswrapper[4675]: E0216 19:57:47.577392 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc\": container with ID starting with a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc not found: ID does not exist" containerID="a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.577415 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc"} err="failed to get container status \"a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc\": rpc error: code = NotFound desc = could not find container \"a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc\": container with ID starting with a392478e9dd6b4b2ba49d2ee42a3f16b4bc4665e2185ba5b35484ad1445bb0bc not found: ID does not exist" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.632048 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"f23a915d-2f12-4529-995b-0a103d0f97a2" (UID: "f23a915d-2f12-4529-995b-0a103d0f97a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.672908 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23a915d-2f12-4529-995b-0a103d0f97a2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.864535 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.869183 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zn64z"] Feb 16 19:57:47 crc kubenswrapper[4675]: I0216 19:57:47.896143 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" path="/var/lib/kubelet/pods/f23a915d-2f12-4529-995b-0a103d0f97a2/volumes" Feb 16 19:57:48 crc kubenswrapper[4675]: I0216 19:57:48.527326 4675 generic.go:334] "Generic (PLEG): container finished" podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6" exitCode=0 Feb 16 19:57:48 crc kubenswrapper[4675]: I0216 19:57:48.527367 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6"} Feb 16 19:57:48 crc kubenswrapper[4675]: I0216 19:57:48.527918 4675 scope.go:117] "RemoveContainer" containerID="55a9711c5328ff5f5c11f4b1a3351942f6dbd223ca91ebbc69660678f4cb0497" Feb 16 19:57:49 crc kubenswrapper[4675]: I0216 19:57:49.538100 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" 
event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"ab7abda2b5b56b56a782fcc31aab20f6680bfd7debcd351ca4e30b4d356c1611"} Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.727677 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.729619 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="extract-content" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.729821 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="extract-content" Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.729962 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="extract-utilities" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.730098 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="extract-utilities" Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.730222 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.730325 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.731240 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="extract-content" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.731370 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="extract-content" Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.731487 4675 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="extract-utilities" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.731609 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="extract-utilities" Feb 16 19:57:52 crc kubenswrapper[4675]: E0216 19:57:52.731764 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.731894 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.732212 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="0771373e-73e4-439b-82b6-fda6035cb8b3" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.732344 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23a915d-2f12-4529-995b-0a103d0f97a2" containerName="registry-server" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.735583 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.746190 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.854048 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.854115 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8dnp\" (UniqueName: \"kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.854143 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.955418 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.955488 4675 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q8dnp\" (UniqueName: \"kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.955526 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.956193 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.956252 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:52 crc kubenswrapper[4675]: I0216 19:57:52.987477 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8dnp\" (UniqueName: \"kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp\") pod \"redhat-marketplace-kb5mt\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:53 crc kubenswrapper[4675]: I0216 19:57:53.057571 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:57:53 crc kubenswrapper[4675]: I0216 19:57:53.298060 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:57:53 crc kubenswrapper[4675]: I0216 19:57:53.565800 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerStarted","Data":"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42"} Feb 16 19:57:53 crc kubenswrapper[4675]: I0216 19:57:53.566295 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerStarted","Data":"c2997af33e3a44d91129ca5f766c667ef67ccf7624cf96efa7afdb32ce5aa173"} Feb 16 19:57:54 crc kubenswrapper[4675]: I0216 19:57:54.576937 4675 generic.go:334] "Generic (PLEG): container finished" podID="26b67419-8bf5-4b29-af24-c1d2db534685" containerID="371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42" exitCode=0 Feb 16 19:57:54 crc kubenswrapper[4675]: I0216 19:57:54.577008 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerDied","Data":"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42"} Feb 16 19:57:55 crc kubenswrapper[4675]: I0216 19:57:55.584396 4675 generic.go:334] "Generic (PLEG): container finished" podID="26b67419-8bf5-4b29-af24-c1d2db534685" containerID="056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6" exitCode=0 Feb 16 19:57:55 crc kubenswrapper[4675]: I0216 19:57:55.584476 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" 
event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerDied","Data":"056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6"} Feb 16 19:57:56 crc kubenswrapper[4675]: I0216 19:57:56.337246 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-25zqb" Feb 16 19:57:56 crc kubenswrapper[4675]: I0216 19:57:56.593643 4675 generic.go:334] "Generic (PLEG): container finished" podID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerID="a6ba6da673eb1b658d7c83b42fae66a78bc838f18c85356b9c5bcb67ebfb96c5" exitCode=0 Feb 16 19:57:56 crc kubenswrapper[4675]: I0216 19:57:56.593735 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfqzm/must-gather-827k7" event={"ID":"97914e45-9fe0-4c19-8aae-2ade5f9afd1a","Type":"ContainerDied","Data":"a6ba6da673eb1b658d7c83b42fae66a78bc838f18c85356b9c5bcb67ebfb96c5"} Feb 16 19:57:56 crc kubenswrapper[4675]: I0216 19:57:56.594206 4675 scope.go:117] "RemoveContainer" containerID="a6ba6da673eb1b658d7c83b42fae66a78bc838f18c85356b9c5bcb67ebfb96c5" Feb 16 19:57:56 crc kubenswrapper[4675]: I0216 19:57:56.596514 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerStarted","Data":"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4"} Feb 16 19:57:57 crc kubenswrapper[4675]: I0216 19:57:57.425289 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfqzm_must-gather-827k7_97914e45-9fe0-4c19-8aae-2ade5f9afd1a/gather/0.log" Feb 16 19:57:59 crc kubenswrapper[4675]: I0216 19:57:59.525738 4675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kb5mt" podStartSLOduration=6.117857175 podStartE2EDuration="7.525716633s" podCreationTimestamp="2026-02-16 19:57:52 +0000 UTC" firstStartedPulling="2026-02-16 
19:57:54.580072283 +0000 UTC m=+957.705361879" lastFinishedPulling="2026-02-16 19:57:55.987931781 +0000 UTC m=+959.113221337" observedRunningTime="2026-02-16 19:57:56.635168238 +0000 UTC m=+959.760457784" watchObservedRunningTime="2026-02-16 19:57:59.525716633 +0000 UTC m=+962.651006199" Feb 16 19:57:59 crc kubenswrapper[4675]: I0216 19:57:59.531252 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-25zqb"] Feb 16 19:57:59 crc kubenswrapper[4675]: I0216 19:57:59.901086 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"] Feb 16 19:57:59 crc kubenswrapper[4675]: I0216 19:57:59.901466 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8hb2k" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="registry-server" containerID="cri-o://b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13" gracePeriod=2 Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.263452 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.465412 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content\") pod \"1984d92c-2f8f-431e-9006-2a8e14bad660\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.465519 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gml4p\" (UniqueName: \"kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p\") pod \"1984d92c-2f8f-431e-9006-2a8e14bad660\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.465555 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities\") pod \"1984d92c-2f8f-431e-9006-2a8e14bad660\" (UID: \"1984d92c-2f8f-431e-9006-2a8e14bad660\") " Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.466912 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities" (OuterVolumeSpecName: "utilities") pod "1984d92c-2f8f-431e-9006-2a8e14bad660" (UID: "1984d92c-2f8f-431e-9006-2a8e14bad660"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.474780 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p" (OuterVolumeSpecName: "kube-api-access-gml4p") pod "1984d92c-2f8f-431e-9006-2a8e14bad660" (UID: "1984d92c-2f8f-431e-9006-2a8e14bad660"). InnerVolumeSpecName "kube-api-access-gml4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.519931 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1984d92c-2f8f-431e-9006-2a8e14bad660" (UID: "1984d92c-2f8f-431e-9006-2a8e14bad660"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.567934 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.568017 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gml4p\" (UniqueName: \"kubernetes.io/projected/1984d92c-2f8f-431e-9006-2a8e14bad660-kube-api-access-gml4p\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.568035 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1984d92c-2f8f-431e-9006-2a8e14bad660-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.644051 4675 generic.go:334] "Generic (PLEG): container finished" podID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerID="b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13" exitCode=0 Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.644077 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hb2k" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.644120 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerDied","Data":"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13"} Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.644193 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hb2k" event={"ID":"1984d92c-2f8f-431e-9006-2a8e14bad660","Type":"ContainerDied","Data":"24b927a4145a3fb832b93a4e9b62e5e47f7c9b646933dc2c8800a167b293beb2"} Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.644221 4675 scope.go:117] "RemoveContainer" containerID="b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.664822 4675 scope.go:117] "RemoveContainer" containerID="4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.674681 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"] Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.683001 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8hb2k"] Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.698458 4675 scope.go:117] "RemoveContainer" containerID="8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.722500 4675 scope.go:117] "RemoveContainer" containerID="b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13" Feb 16 19:58:00 crc kubenswrapper[4675]: E0216 19:58:00.723844 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13\": container with ID starting with b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13 not found: ID does not exist" containerID="b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.723919 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13"} err="failed to get container status \"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13\": rpc error: code = NotFound desc = could not find container \"b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13\": container with ID starting with b9f83c5d8a9e736004f1db5c1460175df31ed9aae7bd5bace490c304e8ba7e13 not found: ID does not exist" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.723953 4675 scope.go:117] "RemoveContainer" containerID="4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193" Feb 16 19:58:00 crc kubenswrapper[4675]: E0216 19:58:00.724417 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193\": container with ID starting with 4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193 not found: ID does not exist" containerID="4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.724480 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193"} err="failed to get container status \"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193\": rpc error: code = NotFound desc = could not find container \"4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193\": container with ID 
starting with 4bac1baa3f43b1e918b5f6db17874ed74f6ea9179c8e9b0c502c9172e3887193 not found: ID does not exist" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.724508 4675 scope.go:117] "RemoveContainer" containerID="8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36" Feb 16 19:58:00 crc kubenswrapper[4675]: E0216 19:58:00.725023 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36\": container with ID starting with 8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36 not found: ID does not exist" containerID="8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36" Feb 16 19:58:00 crc kubenswrapper[4675]: I0216 19:58:00.725051 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36"} err="failed to get container status \"8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36\": rpc error: code = NotFound desc = could not find container \"8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36\": container with ID starting with 8ef9e8b2d340c6342d422ce3fde16a743665fcfa5e27ba89f5578dc104430e36 not found: ID does not exist" Feb 16 19:58:01 crc kubenswrapper[4675]: I0216 19:58:01.893873 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" path="/var/lib/kubelet/pods/1984d92c-2f8f-431e-9006-2a8e14bad660/volumes" Feb 16 19:58:03 crc kubenswrapper[4675]: I0216 19:58:03.059147 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:03 crc kubenswrapper[4675]: I0216 19:58:03.060063 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:03 crc 
kubenswrapper[4675]: I0216 19:58:03.135933 4675 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:03 crc kubenswrapper[4675]: I0216 19:58:03.729760 4675 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.395567 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wfqzm/must-gather-827k7"] Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.396313 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wfqzm/must-gather-827k7" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="copy" containerID="cri-o://536b66f204ebfaf9473337dce75923dca5e7e96d7df52b5f7a3c0913f10bbbe2" gracePeriod=2 Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.403109 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wfqzm/must-gather-827k7"] Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.677382 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfqzm_must-gather-827k7_97914e45-9fe0-4c19-8aae-2ade5f9afd1a/copy/0.log" Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.678029 4675 generic.go:334] "Generic (PLEG): container finished" podID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerID="536b66f204ebfaf9473337dce75923dca5e7e96d7df52b5f7a3c0913f10bbbe2" exitCode=143 Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.765518 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfqzm_must-gather-827k7_97914e45-9fe0-4c19-8aae-2ade5f9afd1a/copy/0.log" Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.766409 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.933036 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwhd5\" (UniqueName: \"kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5\") pod \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.933105 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output\") pod \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\" (UID: \"97914e45-9fe0-4c19-8aae-2ade5f9afd1a\") " Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.942850 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5" (OuterVolumeSpecName: "kube-api-access-mwhd5") pod "97914e45-9fe0-4c19-8aae-2ade5f9afd1a" (UID: "97914e45-9fe0-4c19-8aae-2ade5f9afd1a"). InnerVolumeSpecName "kube-api-access-mwhd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:58:04 crc kubenswrapper[4675]: I0216 19:58:04.986800 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "97914e45-9fe0-4c19-8aae-2ade5f9afd1a" (UID: "97914e45-9fe0-4c19-8aae-2ade5f9afd1a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.034598 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwhd5\" (UniqueName: \"kubernetes.io/projected/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-kube-api-access-mwhd5\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.034649 4675 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97914e45-9fe0-4c19-8aae-2ade5f9afd1a-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.476992 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.685982 4675 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfqzm_must-gather-827k7_97914e45-9fe0-4c19-8aae-2ade5f9afd1a/copy/0.log" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.686525 4675 scope.go:117] "RemoveContainer" containerID="536b66f204ebfaf9473337dce75923dca5e7e96d7df52b5f7a3c0913f10bbbe2" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.686566 4675 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfqzm/must-gather-827k7" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.711740 4675 scope.go:117] "RemoveContainer" containerID="a6ba6da673eb1b658d7c83b42fae66a78bc838f18c85356b9c5bcb67ebfb96c5" Feb 16 19:58:05 crc kubenswrapper[4675]: I0216 19:58:05.897246 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" path="/var/lib/kubelet/pods/97914e45-9fe0-4c19-8aae-2ade5f9afd1a/volumes" Feb 16 19:58:06 crc kubenswrapper[4675]: I0216 19:58:06.696433 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kb5mt" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="registry-server" containerID="cri-o://4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4" gracePeriod=2 Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.145869 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.271151 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8dnp\" (UniqueName: \"kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp\") pod \"26b67419-8bf5-4b29-af24-c1d2db534685\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.271441 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities\") pod \"26b67419-8bf5-4b29-af24-c1d2db534685\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.271519 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content\") pod \"26b67419-8bf5-4b29-af24-c1d2db534685\" (UID: \"26b67419-8bf5-4b29-af24-c1d2db534685\") " Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.273321 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities" (OuterVolumeSpecName: "utilities") pod "26b67419-8bf5-4b29-af24-c1d2db534685" (UID: "26b67419-8bf5-4b29-af24-c1d2db534685"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.277946 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp" (OuterVolumeSpecName: "kube-api-access-q8dnp") pod "26b67419-8bf5-4b29-af24-c1d2db534685" (UID: "26b67419-8bf5-4b29-af24-c1d2db534685"). InnerVolumeSpecName "kube-api-access-q8dnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.330738 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26b67419-8bf5-4b29-af24-c1d2db534685" (UID: "26b67419-8bf5-4b29-af24-c1d2db534685"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.373674 4675 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.373767 4675 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b67419-8bf5-4b29-af24-c1d2db534685-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.373792 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8dnp\" (UniqueName: \"kubernetes.io/projected/26b67419-8bf5-4b29-af24-c1d2db534685-kube-api-access-q8dnp\") on node \"crc\" DevicePath \"\"" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.710171 4675 generic.go:334] "Generic (PLEG): container finished" podID="26b67419-8bf5-4b29-af24-c1d2db534685" containerID="4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4" exitCode=0 Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.710239 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerDied","Data":"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4"} Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.710278 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5mt" event={"ID":"26b67419-8bf5-4b29-af24-c1d2db534685","Type":"ContainerDied","Data":"c2997af33e3a44d91129ca5f766c667ef67ccf7624cf96efa7afdb32ce5aa173"} Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.710308 4675 scope.go:117] "RemoveContainer" containerID="4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 
19:58:07.710446 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5mt" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.736703 4675 scope.go:117] "RemoveContainer" containerID="056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.768210 4675 scope.go:117] "RemoveContainer" containerID="371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.775200 4675 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.781080 4675 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5mt"] Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.792050 4675 scope.go:117] "RemoveContainer" containerID="4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4" Feb 16 19:58:07 crc kubenswrapper[4675]: E0216 19:58:07.793960 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4\": container with ID starting with 4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4 not found: ID does not exist" containerID="4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.794036 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4"} err="failed to get container status \"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4\": rpc error: code = NotFound desc = could not find container \"4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4\": container with ID starting with 
4988b48d835259cd4b889eb03740f6fba6e4ab140e7cf271beeca21e7b6b84c4 not found: ID does not exist" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.794086 4675 scope.go:117] "RemoveContainer" containerID="056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6" Feb 16 19:58:07 crc kubenswrapper[4675]: E0216 19:58:07.798411 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6\": container with ID starting with 056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6 not found: ID does not exist" containerID="056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.798622 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6"} err="failed to get container status \"056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6\": rpc error: code = NotFound desc = could not find container \"056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6\": container with ID starting with 056f5f06604302b0ca254dab5458091c0c6459f6d7ce0d2470ea58ea5e87c5e6 not found: ID does not exist" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.798803 4675 scope.go:117] "RemoveContainer" containerID="371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42" Feb 16 19:58:07 crc kubenswrapper[4675]: E0216 19:58:07.799340 4675 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42\": container with ID starting with 371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42 not found: ID does not exist" containerID="371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42" Feb 16 19:58:07 crc 
kubenswrapper[4675]: I0216 19:58:07.799375 4675 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42"} err="failed to get container status \"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42\": rpc error: code = NotFound desc = could not find container \"371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42\": container with ID starting with 371bf9d9827e0b04e8fc2af3374a3009ba50046bcf2b2feb6afafa243ed0ad42 not found: ID does not exist" Feb 16 19:58:07 crc kubenswrapper[4675]: I0216 19:58:07.897517 4675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" path="/var/lib/kubelet/pods/26b67419-8bf5-4b29-af24-c1d2db534685/volumes" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.214373 4675 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl"] Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217759 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="extract-content" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217798 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="extract-content" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217815 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="registry-server" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217824 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="registry-server" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217839 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" 
containerName="copy" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217849 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="copy" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217863 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="extract-utilities" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217871 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="extract-utilities" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217884 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="extract-utilities" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217895 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="extract-utilities" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217911 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="extract-content" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217922 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="extract-content" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217946 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="gather" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217957 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="gather" Feb 16 20:00:00 crc kubenswrapper[4675]: E0216 20:00:00.217975 4675 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="registry-server" Feb 16 
20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.217986 4675 state_mem.go:107] "Deleted CPUSet assignment" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="registry-server" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.218159 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b67419-8bf5-4b29-af24-c1d2db534685" containerName="registry-server" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.218177 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="copy" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.218192 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="97914e45-9fe0-4c19-8aae-2ade5f9afd1a" containerName="gather" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.218208 4675 memory_manager.go:354] "RemoveStaleState removing state" podUID="1984d92c-2f8f-431e-9006-2a8e14bad660" containerName="registry-server" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.218748 4675 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.221316 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl"] Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.224055 4675 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.224130 4675 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.323375 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2v5g\" (UniqueName: \"kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.323660 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.323760 4675 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.424988 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2v5g\" (UniqueName: \"kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.425087 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.425128 4675 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.426269 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.435118 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.459072 4675 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2v5g\" (UniqueName: \"kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g\") pod \"collect-profiles-29521200-ptqhl\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.550814 4675 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:00 crc kubenswrapper[4675]: I0216 20:00:00.762126 4675 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl"] Feb 16 20:00:01 crc kubenswrapper[4675]: I0216 20:00:01.021804 4675 generic.go:334] "Generic (PLEG): container finished" podID="1fa02817-32b4-4e23-a669-95a2d5d3cd21" containerID="11d8cbdd6537b21cfb6f62d7eb9202ca7bdcb9d632b7bb0dd1db2e3afb6ab98c" exitCode=0 Feb 16 20:00:01 crc kubenswrapper[4675]: I0216 20:00:01.021886 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" event={"ID":"1fa02817-32b4-4e23-a669-95a2d5d3cd21","Type":"ContainerDied","Data":"11d8cbdd6537b21cfb6f62d7eb9202ca7bdcb9d632b7bb0dd1db2e3afb6ab98c"} Feb 16 20:00:01 crc kubenswrapper[4675]: I0216 20:00:01.021940 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" 
event={"ID":"1fa02817-32b4-4e23-a669-95a2d5d3cd21","Type":"ContainerStarted","Data":"398937c8bcdd99a664a3156c8fb6d774064c0cf845cb05e8a471ec80e52f0e02"} Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.361452 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.454711 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2v5g\" (UniqueName: \"kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g\") pod \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.454817 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume\") pod \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.454925 4675 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume\") pod \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\" (UID: \"1fa02817-32b4-4e23-a669-95a2d5d3cd21\") " Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.455776 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume" (OuterVolumeSpecName: "config-volume") pod "1fa02817-32b4-4e23-a669-95a2d5d3cd21" (UID: "1fa02817-32b4-4e23-a669-95a2d5d3cd21"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.460466 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g" (OuterVolumeSpecName: "kube-api-access-t2v5g") pod "1fa02817-32b4-4e23-a669-95a2d5d3cd21" (UID: "1fa02817-32b4-4e23-a669-95a2d5d3cd21"). InnerVolumeSpecName "kube-api-access-t2v5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.461279 4675 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1fa02817-32b4-4e23-a669-95a2d5d3cd21" (UID: "1fa02817-32b4-4e23-a669-95a2d5d3cd21"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.556605 4675 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa02817-32b4-4e23-a669-95a2d5d3cd21-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.556666 4675 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa02817-32b4-4e23-a669-95a2d5d3cd21-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 20:00:02 crc kubenswrapper[4675]: I0216 20:00:02.556740 4675 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2v5g\" (UniqueName: \"kubernetes.io/projected/1fa02817-32b4-4e23-a669-95a2d5d3cd21-kube-api-access-t2v5g\") on node \"crc\" DevicePath \"\"" Feb 16 20:00:03 crc kubenswrapper[4675]: I0216 20:00:03.041483 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" 
event={"ID":"1fa02817-32b4-4e23-a669-95a2d5d3cd21","Type":"ContainerDied","Data":"398937c8bcdd99a664a3156c8fb6d774064c0cf845cb05e8a471ec80e52f0e02"} Feb 16 20:00:03 crc kubenswrapper[4675]: I0216 20:00:03.041542 4675 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="398937c8bcdd99a664a3156c8fb6d774064c0cf845cb05e8a471ec80e52f0e02" Feb 16 20:00:03 crc kubenswrapper[4675]: I0216 20:00:03.041602 4675 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521200-ptqhl" Feb 16 20:00:17 crc kubenswrapper[4675]: I0216 20:00:17.554261 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 20:00:17 crc kubenswrapper[4675]: I0216 20:00:17.555509 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 20:00:47 crc kubenswrapper[4675]: I0216 20:00:47.553854 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 20:00:47 crc kubenswrapper[4675]: I0216 20:00:47.554788 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 20:01:17 crc kubenswrapper[4675]: I0216 20:01:17.554768 4675 patch_prober.go:28] interesting pod/machine-config-daemon-j7pnb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 20:01:17 crc kubenswrapper[4675]: I0216 20:01:17.555612 4675 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 20:01:17 crc kubenswrapper[4675]: I0216 20:01:17.555689 4675 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" Feb 16 20:01:17 crc kubenswrapper[4675]: I0216 20:01:17.556818 4675 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab7abda2b5b56b56a782fcc31aab20f6680bfd7debcd351ca4e30b4d356c1611"} pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 20:01:17 crc kubenswrapper[4675]: I0216 20:01:17.556910 4675 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" podUID="10414964-83d0-4d95-a89f-e3212a8015b5" containerName="machine-config-daemon" containerID="cri-o://ab7abda2b5b56b56a782fcc31aab20f6680bfd7debcd351ca4e30b4d356c1611" gracePeriod=600 Feb 16 20:01:18 crc kubenswrapper[4675]: I0216 20:01:18.589352 4675 generic.go:334] "Generic (PLEG): container finished" 
podID="10414964-83d0-4d95-a89f-e3212a8015b5" containerID="ab7abda2b5b56b56a782fcc31aab20f6680bfd7debcd351ca4e30b4d356c1611" exitCode=0 Feb 16 20:01:18 crc kubenswrapper[4675]: I0216 20:01:18.589472 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerDied","Data":"ab7abda2b5b56b56a782fcc31aab20f6680bfd7debcd351ca4e30b4d356c1611"} Feb 16 20:01:18 crc kubenswrapper[4675]: I0216 20:01:18.590143 4675 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j7pnb" event={"ID":"10414964-83d0-4d95-a89f-e3212a8015b5","Type":"ContainerStarted","Data":"c4dbb90c6a67068ac3900562867c9d183c5eaa5a1a9d1d7358958aaad6525bf0"} Feb 16 20:01:18 crc kubenswrapper[4675]: I0216 20:01:18.590173 4675 scope.go:117] "RemoveContainer" containerID="5ecd42ebdf737873cb9cde7b416996109a5a4903376f505790f6ade57f8714b6"